
As a DataOps Lead, you will be responsible for designing and managing highly scalable and available solutions for data pipelines that provide the foundation for collecting, storing, modelling, and analysing massive data sets from multiple channels.
Responsibilities:
Align Sigmoid with key Client initiatives
o Interface daily with customers across leading Fortune 500 companies to understand their strategic requirements
o Connect with VP- and Director-level clients on a regular basis
o Travel to client locations
o Understand business requirements and tie them to technology solutions
Strategically support Technical Initiatives
o Design, manage, and deploy highly scalable and fault-tolerant distributed components using Big Data technologies
o Evaluate and choose technology stacks that best fit client data strategy and constraints
Drive Automation and massive deployments
o Drive good engineering practices from the bottom up
o Develop industry-leading CI/CD, monitoring, and support practices inside the team
o Develop scripts that automate DevOps processes and reduce team effort
o Work with the team to develop automation and resolve issues
Support TB scale pipelines
o Perform root cause analysis for production errors
o Support developers in day-to-day DevOps operations
o Bring strong experience in application support, integration development, and data management
o Design the roster and escalation matrix for the team
Provide technical leadership and day-to-day management
o Guide DevOps engineers in day-to-day design, automation, and support tasks
o Play a key role in hiring technical talent to build the future of Sigmoid
o Conduct technology-stack training for developers, both in-house and external
Culture
o Must be a strategic thinker with the ability to think unconventionally / outside the box
o Analytical and data-driven orientation
o Raw intellect, talent, and energy are critical
o Entrepreneurial and agile: understands the demands of a private, high-growth company
o Ability to be both a leader and a hands-on doer
Qualifications:
7 - 12 years of relevant work experience and a degree in Computer Science or a related technical discipline are required
Proven track record of building and shipping large-scale engineering products and/or knowledge of cloud infrastructure such as Azure/AWS preferred
Experience in Python/Java programming or any scripting language
Experience managing Linux systems and build/release tools such as Jenkins
Effective communication skills (both written and verbal)
Ability to collaborate with a diverse set of engineers, data scientists, and product managers
Comfort in a fast-paced start-up environment
Preferred Qualifications:
Support experience in the Big Data domain
Architecting, implementing, and maintaining Big Data solutions
Experience with the Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka, etc.)
Experience with container technologies such as Docker and Kubernetes, and with configuration management systems
Interested candidates can apply or reply to me at [Confidential Information] with an updated CV and the information below.
CCTC -
ECTC -
Official NP -
Negotiable up to -
Current Location -
Job ID: 136216305