Info Way Solutions

Data Ingestion Engineer

  • Posted a day ago

Job Description

The Data Ingestion Engineer is responsible for designing, building, and maintaining scalable, reliable data ingestion pipelines that move data from multiple sources into enterprise data platforms. The role focuses on developing robust pipelines, ensuring data quality, and making data available efficiently for analytics, reporting, and machine learning initiatives.

Key Responsibilities

  • Design and develop high-performance data ingestion pipelines for batch and real-time data processing.
  • Integrate data from various sources such as APIs, databases, files, streaming platforms, and third-party systems.
  • Build and maintain scalable ETL/ELT frameworks and reusable ingestion components.
  • Implement data validation, monitoring, and error-handling mechanisms to ensure high data quality and reliability.
  • Optimize data ingestion performance and scalability for large-volume datasets.
  • Work closely with data engineers, data architects, and analytics teams to define data ingestion standards and best practices.
  • Develop and maintain data pipeline documentation, data lineage, and metadata management.
  • Troubleshoot pipeline failures and implement proactive monitoring solutions.
  • Ensure compliance with data governance, security, and privacy standards.
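
The validation, error-handling, and monitoring responsibilities above can be sketched in a few lines of Python. This is a minimal illustration only; the schema, field names, and dead-letter approach are hypothetical, not taken from this role's actual stack:

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

# Illustrative schema: required fields and their expected types.
SCHEMA = {"id": int, "event": str, "amount": float}

def validate(record: dict) -> bool:
    """Return True if the record has every required field with the right type."""
    return all(isinstance(record.get(f), t) for f, t in SCHEMA.items())

def ingest_records(raw_lines):
    """Parse JSON lines; route malformed or invalid rows to a dead-letter list."""
    good, dead_letter = [], []
    for line in raw_lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            dead_letter.append(line)
            continue
        if validate(record):
            good.append(record)
        else:
            dead_letter.append(line)
    log.info("ingested=%d rejected=%d", len(good), len(dead_letter))
    return good, dead_letter
```

Production pipelines add retries, schema registries, and metrics, but the shape is the same: validate every record, never silently drop failures, and emit counts that monitoring can alert on.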

Required Qualifications

  • Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 6+ years of experience in data engineering or data pipeline development.
  • Strong experience building data ingestion pipelines in cloud or distributed environments.
  • Proficiency in Python, Scala, or Java for data processing.
  • Hands-on experience with ETL/ELT tools and frameworks.
  • Experience with SQL and NoSQL databases.
  • Knowledge of data modeling and data warehousing concepts.
  • Strong understanding of API integrations and data extraction techniques.
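
The API-integration and data-extraction skills listed above often come down to walking a paginated endpoint to completion. A minimal sketch, where `fetch_page` is a stand-in for a real HTTP call (the payload shape here is assumed, not from any particular API):

```python
from typing import Callable, Iterator

def extract_all(fetch_page: Callable[[int], dict]) -> Iterator[dict]:
    """Walk a page-numbered API until it reports no next page.

    `fetch_page(page)` must return {"items": [...], "has_next": bool}.
    """
    page = 0
    while True:
        payload = fetch_page(page)
        yield from payload["items"]
        if not payload.get("has_next"):
            break
        page += 1

# Usage with a fake in-memory API:
pages = [{"items": [1, 2], "has_next": True}, {"items": [3], "has_next": False}]
records = list(extract_all(lambda p: pages[p]))
```

Passing the fetch function in keeps the pagination logic testable without a network, which is the kind of reusable ingestion component the role calls for.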

Preferred Skills

  • Experience with streaming platforms (e.g., Kafka).
  • Hands-on experience with cloud platforms (AWS, Azure, or GCP).
  • Experience with data orchestration tools (Airflow, Prefect, etc.).
  • Familiarity with big data technologies such as Spark or distributed processing frameworks.
  • Experience with CI/CD pipelines and DevOps practices.
  • Knowledge of data governance and data quality frameworks.
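
At their core, the orchestration tools named above (Airflow, Prefect) run tasks in dependency order. A toy version using Python's stdlib `graphlib` shows the idea; the task names and dependency map are illustrative:

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run tasks in dependency order, as an orchestrator would.

    tasks: {name: callable}; deps: {name: set of upstream task names}.
    Returns the execution order.
    """
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()
    return order

# Usage: a classic extract -> transform -> load chain.
ran = []
tasks = {t: (lambda t=t: ran.append(t)) for t in ("extract", "transform", "load")}
order = run_pipeline(tasks, {"transform": {"extract"}, "load": {"transform"}})
```

Real orchestrators add scheduling, retries, and parallelism on top, but the dependency graph is the same abstraction.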

Key Competencies

  • Advanced problem-solving and analytical skills
  • Scalable system design mindset
  • Strong collaboration with cross-functional teams
  • Performance optimization and troubleshooting expertise
  • Documentation and knowledge sharing

Success Metrics

  • Reliability and performance of ingestion pipelines
  • Data availability and timeliness for downstream systems
  • Reduction in pipeline failures and data quality issues
  • Scalability of ingestion architecture

Job ID: 144186103