enterprise minds, inc

GCP Data Engineer

6-8 Years
  • Posted 2 days ago
  • Be among the first 10 applicants

Job Description

Hiring: Senior Data Engineer (GCP + Databricks) | 6-8 Years | High-Impact Role

Are you passionate about building scalable data platforms and working with cutting-edge cloud technologies?

We're looking for a Senior Data Engineer who can turn complex data into powerful business insights.

About the Role

As a Senior Data Engineer, you will design, build, and optimize modern data platforms that power analytics, AI, and business decision-making. You'll work closely with data scientists, analysts, and product teams to deliver reliable and scalable data solutions.

Key Responsibilities

  • Design and build scalable ETL/ELT pipelines using cloud-native technologies
  • Develop and optimize data workflows in GCP (BigQuery, Dataflow, Pub/Sub, Dataplex)
  • Work on Databricks ecosystem including Delta Lake, DLT, and Unity Catalog
  • Implement data modeling techniques (Star Schema, Snowflake, 3NF, denormalized models)
  • Build and maintain high-performance data pipelines for large-scale processing
  • Ensure data quality, governance, and metadata management
  • Collaborate with cross-functional teams to deliver data-driven solutions
  • Support AI/ML pipelines and enable data science use cases
  • Ensure compliance with data security and governance standards (SOX/PCI)
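
The data modeling responsibility above (Star Schema, denormalized models) can be sketched in plain Python. This is a minimal illustration only; the table and column names (`raw_orders`, `dim_customer`, `fact_sales`) are hypothetical and not taken from the posting:

```python
# Minimal star-schema load: split denormalized source rows into a
# dimension table (customer attributes) and a fact table (measures).
# All names here are illustrative assumptions, not the employer's schema.

raw_orders = [  # "extract": denormalized source rows
    {"order_id": 1, "customer": "Acme", "region": "EU", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "region": "EU", "amount": 80.0},
    {"order_id": 3, "customer": "Beta", "region": "US", "amount": 50.0},
]

def build_star_schema(rows):
    """Transform raw rows into a customer dimension and a sales fact table."""
    dim_customer = {}   # surrogate key -> customer attributes
    fact_sales = []     # one row per order, referencing the dimension
    surrogate_keys = {} # natural key -> surrogate key
    for row in rows:
        natural_key = (row["customer"], row["region"])
        if natural_key not in surrogate_keys:
            surrogate_keys[natural_key] = len(surrogate_keys) + 1
            dim_customer[surrogate_keys[natural_key]] = {
                "customer": row["customer"],
                "region": row["region"],
            }
        fact_sales.append({
            "order_id": row["order_id"],
            "customer_key": surrogate_keys[natural_key],
            "amount": row["amount"],
        })
    return dim_customer, fact_sales

dim_customer, fact_sales = build_star_schema(raw_orders)
```

In a real pipeline the same shape would be expressed in BigQuery SQL or on Databricks (e.g. writing `dim_customer` and `fact_sales` as Delta tables), but the transform logic is the same.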

Must-Have Skills

  • 6-8 years of experience in Data Engineering / Data Platform Development
  • Strong hands-on experience in GCP ecosystem
  • Experience with Databricks (Delta Lake, DLT, Unity Catalog)
  • Expertise in data architectures: Data Lake, Data Warehouse, Lakehouse
  • Strong knowledge of data modeling & pipeline optimization
  • Experience with ETL/ELT frameworks and big data processing
  • Solid understanding of data governance & quality frameworks
  • Good communication skills and ability to translate business requirements into technical solutions

Good to Have

  • Exposure to Data Mesh / Domain-driven architecture
  • Experience with modern data stack (dbt, Terraform, Cloud Composer, Dataplex, Atlan)
  • Knowledge of data cataloging & lineage tools
  • Understanding of FinOps (cloud cost optimization)
  • Exposure to AI/ML pipelines, LLMs, or AI-powered analytics
  • Experience in product-based or large enterprise environments


Job ID: 145109677