Job Title: Data Engineer (Airflow, Python, SQL)
Experience: 6-7 Years
Location: Remote
Employment Type: Contract
Job Summary
We are seeking a skilled Data Engineer with 6-7 years of experience to design, build, and maintain scalable data pipelines. The ideal candidate will have strong expertise in Apache Airflow, Python, SQL, and data pipeline monitoring, ensuring reliable data processing and workflow orchestration across the organization.
Key Responsibilities
- Design, develop, and maintain scalable and reliable data pipelines for processing large datasets.
- Build and orchestrate data workflows using Apache Airflow.
- Develop efficient ETL/ELT processes using Python and SQL.
- Monitor and troubleshoot data pipelines and workflow failures, ensuring high availability and reliability.
- Implement data quality checks and monitoring frameworks to maintain data integrity.
- Optimize database queries and improve data processing performance.
- Collaborate with data analysts, data scientists, and engineering teams to support analytics and reporting needs.
- Maintain documentation for data architecture, workflows, and processes.
Required Skills
- 6-7 years of experience in Data Engineering or a related role.
- Strong hands-on experience with Apache Airflow for workflow orchestration.
- Proficiency in Python for building data pipelines and automation.
- Advanced knowledge of SQL for data extraction, transformation, and optimization.
- Experience with data pipeline monitoring, debugging, and performance tuning.
- Strong understanding of ETL/ELT processes and data integration techniques.
- Experience working with large-scale data processing environments.
Good to Have
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with data warehouses and big data technologies.
- Knowledge of data governance and data quality frameworks.