We are looking for a Senior Data Engineer with strong expertise in SQL, advanced data warehousing concepts, and programming skills in Python or Java. The role requires hands-on experience with GCP BigQuery, Dataflow, and Cloud Composer, and strong exposure to data governance and data management. All transformations are built using dbt, so familiarity with dbt development and best practices is essential. Experience with Snowflake or Databricks is a plus.
Responsibilities
- Build scalable data pipelines using BigQuery, Dataflow, and Cloud Composer.
- Develop ELT transformations and data models using dbt, following modular, testable, and governed practices.
- Apply advanced data warehousing techniques, performance optimization, partitioning, clustering, and data modeling patterns.
- Implement data governance, quality rules, lineage, and metadata workflows.
- Develop production-ready code in Python/Java for orchestration, automation, and data processing.
- Collaborate with analytics, product, and business teams to define data requirements.
- Ensure secure, reliable, and cost-optimized data pipelines aligned to best practices.
- Support CI/CD, monitoring, troubleshooting, and continuous improvement of pipelines and dbt models.
Qualifications
- 4-8 years of experience in data engineering, data warehousing, or analytics engineering roles.
- Strong expertise in SQL and advanced data warehousing topics (modeling, performance tuning, optimization).
- Hands-on experience with GCP BigQuery, Dataflow, and Cloud Composer.
- Practical experience developing transformations and models in dbt.
- Proficiency in Python or Java for data processing and automation.
- Strong problem-solving skills, communication, and ability to collaborate with cross-functional teams.
Multiple openings available for BigQuery/Snowflake/Databricks roles. Send your resume to [Confidential Information]