AuxoAI

Senior Data Engineer - Google Cloud Platform

Posted a day ago

Job Description

AuxoAI is seeking a Senior Data Engineer to lead the design, development, and optimization of modern data pipelines and cloud-native platforms using Google Cloud Platform (GCP).

This role is ideal for someone with deep experience building scalable batch and streaming data workflows, strong hands-on engineering skills, and a drive to mentor junior engineers.

You'll work closely with cross-functional teams to build production-grade pipelines using tools like BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Dataform, enabling high-quality data delivery and analytics at scale.

Responsibilities

  • Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow); brief illustrative sketches of this and the following items appear after this list
  • Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
  • Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
  • Implement SQL-based transformations using Dataform (or dbt)
  • Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
  • Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
  • Partner with solution architects and product teams to translate data requirements into technical designs
  • Mentor junior data engineers and support knowledge-sharing across the team
  • Contribute to documentation, code reviews, sprint planning, and agile ceremonies
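
To give a concrete flavor of the pipeline work above, here is a minimal Python sketch of a streaming Dataflow job that reads events from Pub/Sub and appends them to BigQuery. The project, subscription, and table names are hypothetical placeholders, not details from this posting.

    # A minimal sketch of a streaming Beam pipeline for Dataflow.
    # Project, subscription, and table names are hypothetical.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # streaming=True is required for unbounded Pub/Sub sources;
        # the Dataflow runner is selected via standard pipeline flags.
        opts = PipelineOptions(streaming=True)
        with beam.Pipeline(options=opts) as p:
            (
                p
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    subscription="projects/my-project/subscriptions/events-sub")
                | "ParseJson" >> beam.Map(lambda m: json.loads(m.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
            )

    if __name__ == "__main__":
        run()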
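
Likewise, designing BigQuery datasets around partitioning and clustering might look like the following sketch using the google-cloud-bigquery client; again, all resource and column names are illustrative assumptions.

    # A minimal sketch: create a day-partitioned, clustered BigQuery table.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project

    table = bigquery.Table(
        "my-project.analytics.events",
        schema=[
            bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
            bigquery.SchemaField("user_id", "STRING"),
            bigquery.SchemaField("event_type", "STRING"),
        ],
    )
    # Day-partition on the event timestamp so queries can prune by date,
    # and cluster on the columns most often used in filters and joins.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="event_ts")
    table.clustering_fields = ["event_type", "user_id"]
    client.create_table(table, exists_ok=True)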
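
Finally, orchestration in Cloud Composer means authoring Airflow DAGs such as the sketch below, which schedules a daily SQL aggregation of the kind that Dataform or dbt would otherwise manage. The DAG id, schedule, and table names are assumptions for illustration only.

    # A minimal Airflow DAG sketch for Cloud Composer; names are hypothetical.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_event_rollup",
        schedule_interval="0 2 * * *",  # 02:00 UTC, after upstream loads land
        start_date=datetime(2024, 1, 1),
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        # Rebuild one day's aggregate partition per run; {{ ds }} is the
        # logical run date that Airflow templates into the query.
        rollup = BigQueryInsertJobOperator(
            task_id="rollup_daily_events",
            configuration={
                "query": {
                    "query": """
                        SELECT DATE(event_ts) AS event_date,
                               event_type,
                               COUNT(*) AS events
                        FROM `my-project.analytics.events`
                        WHERE DATE(event_ts) = '{{ ds }}'
                        GROUP BY event_date, event_type
                    """,
                    "destinationTable": {
                        "projectId": "my-project",
                        "datasetId": "analytics",
                        "tableId": "daily_event_counts${{ ds_nodash }}",
                    },
                    "writeDisposition": "WRITE_TRUNCATE",
                    "useLegacySql": False,
                }
            },
        )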

Requirements

  • 5+ years of hands-on experience in data engineering, with at least 3 years on GCP
  • Proven expertise in BigQuery, Dataflow (Apache Beam), and Cloud Composer (Airflow)
  • Strong programming skills in Python and/or Java
  • Experience with SQL optimization, data modeling, and pipeline orchestration
  • Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
  • Exposure to Dataform, dbt, or similar tools for ELT workflows
  • Solid understanding of data architecture, schema design, and performance tuning
  • Excellent problem-solving and collaboration skills

Bonus Skills

  • GCP Professional Data Engineer certification
  • Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
  • Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
  • Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)

(ref:hirist.tech)

Job ID: 134322639