
Cloudesign

GCP Data Engineer with Python


Job Description

Chennai/Bangalore (HYBRID)

Core Responsibilities

  • Data Engineering & Cloud Development

Design, build, and maintain scalable data processing systems on Google Cloud Platform (GCP).

Develop end-to-end data pipelines supporting ingestion, transformation, and storage.

Build ELT/ETL workflows, database objects, and cloud-native data orchestration solutions.

Optimize pipelines for performance, scalability, cost efficiency, and reliability.

Ensure robust monitoring, alerting, and data quality assurance across distributed systems.

Implement secure data architectures following Trane Technologies standards.

  • Collaboration & Analytics Enablement

Work closely with data analysts and data scientists to understand business requirements and translate them into scalable data solutions.

Support analytical use cases and data modelling, and enable self-service data consumption.

Partner across global teams, including business stakeholders, product owners, and data governance teams.

  • DevOps / Platform Engineering Partnership

Partner with DevOps and platform engineering teams to ensure data infrastructure is secure, reliable, and highly available.

Apply working knowledge of CI/CD practices (code versioning, automation, testing frameworks) to support cloud-based deployments.

Collaborate on environment setup and pipeline configuration, and promote code through dev, test, and prod.

  • Engineering Excellence & Ways of Working

Participate in code reviews and maintain global coding standards.

Follow structured engineering methods, documentation practices, and release processes.

Contribute effectively in distributed, agile development teams across multiple time zones.

Communicate clearly with both technical and non-technical stakeholders.

  • Innovation & Continuous Improvement

Research, evaluate, and propose new tools, technologies, and design patterns.

Stay current with cloud, data engineering, and analytics trends relevant to enterprise-scale environments.

Apply continuous improvement practices to enhance reliability, quality, and developer productivity.

Education Requirements

Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, Software Engineering, or a related technical field.

Experience Requirements

5+ years of total experience in data engineering, data platforms, or data-centric software engineering.

5+ years of hands-on experience designing and building data pipelines, ETL/ELT workflows, and cloud-native data solutions.

3+ years of direct experience with Google Cloud Platform (GCP) using services such as BigQuery, Cloud Storage, Cloud Run, Cloud Data Fusion, Dataproc, Composer, and Pub/Sub.

Advanced proficiency in SQL, Python, and PySpark for data processing and transformation.

Proven experience developing and supporting production-grade, end-to-end cloud data pipelines.

Expertise in ELT/ETL design, performance optimization, and data transformation frameworks.

Strong background in monitoring, logging, alerting, and data quality frameworks across distributed systems.

Experience with DevOps tools and techniques, including CI/CD, GitHub, and related automation practices.

Demonstrated experience working in global, cross-functional, and agile development environments.

Excellent communication, problem-solving, and analytical skills, with the ability to collaborate across technical and non-technical teams.

GCP certification preferred (e.g., Google Cloud Professional Data Engineer).


Job ID: 143925331