
GCP Data Engineer

Experience: 10-20 Years

Job Description

Required Qualifications

  • Expert-level Python for backend/data engineering
  • Hands-on GCP: Dataflow, BigQuery, Cloud Composer/Airflow, Cloud Functions, Cloud Run, GCS, IAM
  • Design and development of ETL/ELT pipelines (batch and streaming)
  • Apache Spark for large-scale processing
  • Apache Kafka for messaging/streaming
  • Orchestration with Airflow/Cloud Composer (DAG design, scheduling, monitoring); see the DAG sketch after this list
  • Strong SQL with enterprise RDBMS (SQL Server, Oracle, PostgreSQL)
  • Git/GitHub and CI/CD for data projects
  • Deployments on GKE and Cloud Run; understanding of autoscaling and load balancers
  • API development with Python FastAPI; APIGEE proxy management
  • Data quality frameworks, validation rules, monitoring/observability
  • Security: IAM roles/policies, data redaction, DLP
  • Data ingestion from varied sources; transformation and cleansing
  • Documentation of pipelines, data flows, and operations
  • Experience migrating on-prem data to cloud

Responsibilities

  • Design, develop, test, and maintain scalable ETL/ELT data pipelines in Python
  • Architect data solutions using Kafka, GCP services (Dataflow, BigQuery, Composer/Airflow, Functions, Cloud Run/GKE, GCS, IAM), dbt, and related tools
  • Implement streaming and batch processing; ensure autoscaling, reliability, and cost efficiency
  • Develop and manage Airflow/Cloud Composer DAGs for orchestration (scheduling, monitoring, alerting)
  • Build APIs with FastAPI and manage them via APIGEE; integrate load balancers when required (see the FastAPI sketch after this list)
  • Ingest data from diverse sources; apply transformation, cleansing, and enrichment
  • Implement data quality checks, validation rules, observability, and monitoring
  • Apply security best practices including IAM, data redaction, and DLP
  • Write and optimize complex SQL for extraction, validation, and analytics
  • Manage code in GitHub; implement CI/CD pipelines for data workloads
  • Deploy and operate workloads on GKE and Cloud Run
  • Collaborate with data science/analytics teams to deliver fit-for-purpose data products
  • Create and maintain documentation of pipeline designs, data flows, and operational runbooks
  • Lead or contribute to on-prem to cloud data migration initiatives

More Info

Function:
Employment Type:
Open to candidates from: India

About Company

Welcome to CodeOrion, where innovation meets excellence in software development. We are a dynamic team of passionate professionals dedicated to crafting cutting-edge software solutions that drive business success. At CodeOrion, we believe in the power of technology to transform industries and enhance lives. Our mission is to deliver tailored software solutions that not only meet our clients' unique needs but also exceed their expectations. Whether you are a startup looking to build your first product or an established enterprise aiming to innovate, we are here to help you every step of the way.

Our services include:

  • Game Development
  • Custom Software Development
  • Mobile App Development
  • Web Development
  • Cloud Solutions
  • IT Consulting and Support

What sets us apart is our commitment to quality, transparency, and continuous improvement. We leverage the latest technologies and best practices to ensure that our solutions are robust, scalable, and secure. Our collaborative approach means we work closely with you to understand your vision and goals, ensuring that the final product aligns perfectly with your business objectives. Join us on this exciting journey of digital transformation. Connect with us today to learn how CodeOrion can help you achieve your technological aspirations.

Job ID: 136089389
