Job Title: Senior Data Engineer / ETL Engineer (GCP)
Location: Pune and Hyderabad
Experience: 5 to 8 years, or 8 to 11 years
CTC: 21 LPA and 28 LPA
Job Summary
We are looking for an experienced Senior Data Engineer / ETL Engineer with strong expertise in Google Cloud Platform (GCP) to design, build, and maintain scalable data pipelines and ETL processes. The ideal candidate will work closely with data analysts, data scientists, and business teams to deliver high-quality data solutions.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines for processing large volumes of structured and unstructured data.
- Build scalable data pipelines using GCP services such as Google BigQuery, Google Cloud Dataflow, Google Cloud Composer, and Google Cloud Storage.
- Develop and optimize data transformation workflows using Apache Airflow and modern ETL frameworks (a minimal pipeline sketch follows this list).
- Implement data ingestion from multiple sources such as APIs, databases, and streaming systems.
- Ensure data quality, reliability, and performance optimization of ETL processes.
- Work with data warehousing solutions like Google BigQuery for analytics and reporting.
- Implement data governance, security, and compliance standards in the data platform.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical data solutions.
- Monitor and troubleshoot production data pipelines and workflows.
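For context, the day-to-day pipeline work resembles the following minimal Airflow DAG sketch. It assumes Airflow 2.4+ with the apache-airflow-providers-google package installed; all bucket, dataset, and table names are hypothetical. The DAG loads raw files from Cloud Storage into a BigQuery staging table, then runs a SQL transformation:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# All bucket, dataset, and table names below are hypothetical examples.
with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load raw CSV files from Cloud Storage into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-raw-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_dataset.sales_staging",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staged rows into an analytics table with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE example_dataset.sales_daily AS "
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM example_dataset.sales_staging "
                    "GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```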
Required Skills
- Strong experience in ETL/ELT development and data pipeline architecture.
- Hands-on experience with Google Cloud services such as Google BigQuery, Google Cloud Dataflow, Google Cloud Pub/Sub, and Google Cloud Storage.
- Strong programming skills in Python and SQL (see the illustrative query sketch after this list).
- Experience with workflow orchestration tools such as Apache Airflow.
- Knowledge of data warehousing concepts and dimensional modeling.
- Experience working with large datasets and distributed data processing frameworks.
- Understanding of CI/CD pipelines and DevOps practices.
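As an illustration of the Python and SQL skills above, here is a minimal sketch using the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery

# Assumes application-default credentials; the project ID is hypothetical.
client = bigquery.Client(project="example-project")

# Aggregate daily order totals from a hypothetical staging table.
query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.example_dataset.sales_staging`
    GROUP BY order_date
    ORDER BY order_date
"""

# Run the query and print each aggregated row.
for row in client.query(query).result():
    print(row.order_date, row.total_amount)
```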
Preferred Skills
- Experience with Apache Spark or Apache Beam (a short Beam sketch follows this list).
- Knowledge of containerization using Docker and orchestration using Kubernetes.
- Experience with streaming technologies like Apache Kafka.
- Familiarity with data visualization tools such as Looker or Tableau.
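For the Apache Beam item above, a minimal batch pipeline sketch is shown below; the bucket paths are hypothetical. The same code can run on Google Cloud Dataflow by supplying a DataflowRunner pipeline option instead of the local default:

```python
import json

import apache_beam as beam

# Read JSON event files, keep positive amounts, and write CSV lines.
# Paths are hypothetical; this runs locally on the default DirectRunner.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "FilterPositive" >> beam.Filter(lambda e: e.get("amount", 0) > 0)
        | "ToCsv" >> beam.Map(lambda e: f"{e['user_id']},{e['amount']}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/amounts")
    )
```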
Experience
- 5+ years of experience in Data Engineering / ETL Development.
- At least 2 years of hands-on experience with GCP cloud services.
Education
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.