

Job Description

Job Title: Data Ops Technical Specialist

Location: Chennai, India (Onsite)
Duration: 6-month contract (extendable)
Experience: 5+ years

Role Summary

We are seeking a Data Ops Technical Specialist to support and operate enterprise-scale data platforms and ETL ecosystems. The role focuses on monitoring, scheduling, troubleshooting, optimizing, and supporting data pipelines across batch and real-time environments. You will work closely with engineering, DBA, infrastructure, and business teams to ensure reliable and high-performing data operations.

Key Roles & Responsibilities
  • Continuously monitor ETL jobs, workflows, and data pipelines to ensure performance, availability, and reliability.

  • Manage and validate job scheduling, ensuring all data processes run as per defined SLAs.

  • Troubleshoot and resolve issues with ETL jobs, data integration, and workflows in a timely manner.

  • Perform root cause analysis (RCA) for recurring incidents and implement preventive and corrective actions.

  • Tune and optimize ETL jobs and workflows for performance, scalability, and efficiency.

  • Monitor and manage system resources including CPU, memory, and storage utilization.

  • Ensure data accuracy, completeness, and consistency through validation and quality checks.

  • Implement and manage error handling and recovery mechanisms within ETL processes.

  • Perform data reconciliation to ensure source and target data consistency.

  • Deploy ETL code, configurations, and workflow changes into production following change management procedures.

  • Develop automation scripts and tools to reduce manual effort and improve operational efficiency.

  • Collaborate with development teams, DBAs, system administrators, and business users to resolve issues and implement enhancements.

  • Provide regular status reports on system health, performance metrics, and incidents to stakeholders.

  • Maintain up-to-date documentation for ETL jobs, workflows, processes, and standard operating procedures (SOPs).

Technical Requirements
  • Strong hands-on experience with relational databases and SQL, with exposure to multiple database technologies (both SQL and NoSQL).

  • Experience designing, building, and supporting big data pipelines for both batch and real-time processing.

  • Hands-on experience with Big Data platforms and tools, including:

    • Hadoop ecosystem

    • Spark

    • Kafka

    • Cloudera

    • Google BigQuery / Dataproc

    • Informatica

  • Practical experience with ETL tools such as Informatica and Talend, along with modern big data processing tools.

  • Experience with data pipeline/workflow orchestration and data quality tools.

  • Knowledge of real-time data ingestion and streaming architectures.

  • Proficiency in one or more programming languages: Python, Java, Scala, C++, or similar.

Qualifications
  • Education: Bachelor's degree in Computer Science, Engineering, or equivalent (Master's degree preferred).

  • Experience: 3–5 years of experience in Data Engineering, Data Operations, or Data Platform roles.

  • Strong ability to work collaboratively with technical and business stakeholders.

  • Excellent analytical, problem-solving, communication, and interpersonal skills.

  • Ability to work in a fast-paced, operationally intensive environment.

Job ID: 141650337