
Senior Data Engineer
Role: Senior Data Engineer
Location: Bangalore
Years of experience: 7+ Years
Shift: General Shift (3 days WFO)
Responsibilities:
• Design and implement GCP-native data pipelines using services such as BigQuery, Dataflow, and Pub/Sub; leverage Cloud Composer for orchestration and automate workflows across GCP services
• Understand the end-to-end data architecture, from ingestion at the source, through transformation and storage, to final consumption by analytics, reporting, or machine learning systems
• Design and implement ETL/ELT pipelines to move and transform data from various sources (APIs, databases, logs, IoT devices, files)
• Ensure ETL processes are efficient, scalable, and optimized for performance
• Implement measures to ensure accuracy, integrity, and consistency of data throughout the ETL process
• Design and optimize data models for structured and unstructured data
• Implement and manage scalable cloud data lakes and warehouses
• Collaborate with cross-functional teams to understand data requirements and deliver solutions
• Troubleshoot and resolve performance and reliability issues across the data platform
• Automate workflows using CI/CD tooling (e.g., GitHub Actions, Jenkins)
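As a rough illustration of the pipeline and data-quality responsibilities above, the following Python sketch shows a single transform step with basic integrity checks. The `Event` shape, field names, and validation rules are assumptions made for the example, not part of the role; in a GCP-native pipeline this logic would typically live inside a Dataflow (Apache Beam) transform, with Cloud Composer orchestrating the surrounding workflow.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative record shape (an assumption; real sources would be the
# APIs, databases, logs, IoT streams, or files listed above).
@dataclass
class Event:
    user_id: str
    amount: float
    ts: str  # UTC ISO-8601 timestamp

def transform(raw_rows: list[dict]) -> list[Event]:
    """Validate and transform raw rows, dropping records that fail
    basic integrity checks (missing keys, non-positive amounts)."""
    out = []
    for row in raw_rows:
        if not {"user_id", "amount", "ts"} <= row.keys():
            continue  # incomplete record: skip rather than load bad data
        amount = float(row["amount"])
        if amount <= 0:
            continue  # integrity check: amounts must be positive
        # Normalize timestamps to UTC so downstream consumers agree.
        ts = datetime.fromisoformat(row["ts"]).astimezone(timezone.utc).isoformat()
        out.append(Event(row["user_id"], amount, ts))
    return out
```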
Mandatory Skills:
Extensive experience with Google Cloud Platform (GCP)/Azure/AWS, including BigQuery, Dataflow, Pub/Sub, Cloud Composer, Vertex AI, Cloud Storage, and Cloud Functions; ability to design and implement GCP-native data engineering solutions.
• 7+ years of data modeling and ETL development experience
• Experience with a cloud data warehouse such as BigQuery
• Strong SQL skills and experience with one or more programming languages (Python, Java, Scala)
• Experience with big data technologies (Hadoop, Spark, Kafka)
• Hands-on knowledge of GCP cloud platform
• Experience with CI/CD pipelines and DevOps practices
• Familiarity with data modeling, warehousing, and governance principles
Preferred Skills:
• Hands-on experience with GCP-native tools and services for data engineering and orchestration
• Certifications in GCP or cloud data engineering
• Familiarity with tools like Looker and dbt
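To illustrate the SQL-plus-Python combination the skills above call for, here is a minimal ELT-style sketch: raw rows are loaded first, then the transform runs as SQL inside the warehouse. Python's built-in sqlite3 stands in for a warehouse such as BigQuery, and the table and column names are illustrative assumptions.

```python
import sqlite3

def run_elt(raw_orders: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """Load raw order lines, then transform in SQL (the "T" after "EL")."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    con.executemany("INSERT INTO raw_orders VALUES (?, ?)", raw_orders)
    # The transform runs in SQL, as it would in BigQuery: aggregate raw
    # order lines into a per-customer revenue model.
    con.execute("""
        CREATE TABLE customer_revenue AS
        SELECT customer, SUM(amount) AS revenue
        FROM raw_orders
        WHERE amount > 0          -- basic data-quality filter
        GROUP BY customer
    """)
    return con.execute(
        "SELECT customer, revenue FROM customer_revenue ORDER BY customer"
    ).fetchall()
```

In a real warehouse the modeled table would typically be managed by a tool such as dbt rather than created ad hoc.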
Job ID: 147477161