
GCP Data Engineer

  • Posted 20 hours ago

Job Description

About us:

Intuitive is an innovation-led engineering company delivering business outcomes for hundreds of enterprises globally. With a reputation as a Tiger Team and a Trusted Partner of enterprise technology leaders, we help solve the most complex Digital Transformation challenges across the following Intuitive Superpowers:

Modernization & Migration

  • Application & Database Modernization
  • Platform Engineering (IaC/EaC, DevSecOps & SRE)
  • Cloud Native Engineering, Migration to Cloud, VMware Exit
  • FinOps

Data & AI/ML

  • Data (Cloud Native / DataBricks / Snowflake)
  • Machine Learning, AI/GenAI

Cybersecurity

  • Infrastructure Security
  • Application Security
  • Data Security
  • AI/Model Security

SDx & Digital Workspace (M365, G Suite)

  • SDDC, SD-WAN, SDN, NetSec, Wireless/Mobility
  • Email, Collaboration, Directory Services, Shared Files Services

Intuitive Services:

  • Professional and Advisory Services
  • Elastic Engineering Services
  • Managed Services
  • Talent Acquisition & Platform Resell Services

About the job:

Title: GCP Data Engineer

Start Date: Immediate

Position Type: Full Time

Location: Ahmedabad, India

Job Summary

We are looking for a skilled GCP Data Engineer with 2-5 years of hands-on experience in building and maintaining scalable data pipelines on Google Cloud Platform. The ideal candidate will work closely with analytics, product, and engineering teams to deliver reliable, high-performance data solutions.

Key Responsibilities

  • Design, build, and maintain ETL/ELT data pipelines on Google Cloud Platform
  • Develop batch and streaming pipelines using BigQuery, Dataflow (Apache Beam), and Pub/Sub
  • Optimize BigQuery queries for performance, scalability, and cost efficiency
  • Manage data ingestion from multiple sources using Cloud Storage (GCS)
  • Orchestrate workflows using Cloud Composer (Apache Airflow)
  • Ensure data quality, consistency, security, and monitoring across pipelines
  • Collaborate with analysts, data scientists, and stakeholders to support data-driven decisions
  • Document data architecture, pipelines, and operational processes
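The ETL/ELT pattern these responsibilities describe can be sketched in plain Python. This is a minimal, hypothetical illustration of the extract → transform → load stages a Dataflow (Apache Beam) pipeline would implement; the function names and the in-memory "warehouse" list are stand-ins for real GCS sources and BigQuery sinks, not part of this role's actual codebase.

```python
# Minimal batch ETL sketch in plain Python. The in-memory list
# stands in for a BigQuery sink; in a real pipeline these stages
# would be Beam transforms running on Dataflow.
import json
from typing import Iterable, Iterator

def extract(raw_lines: Iterable[str]) -> Iterator[dict]:
    """Parse newline-delimited JSON, skipping malformed rows."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production: route to a dead-letter sink

def transform(records: Iterable[dict]) -> Iterator[dict]:
    """Normalize fields and drop records missing a user id."""
    for rec in records:
        if "user_id" not in rec:
            continue
        yield {
            "user_id": rec["user_id"],
            "event": rec.get("event", "unknown").lower(),
        }

def load(records: Iterable[dict], warehouse: list) -> None:
    """Append to the in-memory sink standing in for BigQuery."""
    warehouse.extend(records)

raw = [
    '{"user_id": 1, "event": "CLICK"}',
    'not json',
    '{"event": "view"}',
    '{"user_id": 2, "event": "View"}',
]
table: list = []
load(transform(extract(raw)), table)
print(table)
```

Chaining generators this way keeps each stage independently testable, which is the same property that makes Beam's composable transforms easy to maintain.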

Required Skills & Qualifications

  • 2-5 years of experience in Data Engineering
  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in:
      • BigQuery
      • Dataflow (Apache Beam)
      • Cloud Storage (GCS)
  • Strong SQL skills for analytical and large-scale datasets
  • Proficiency in Python (or Java) for data processing
  • Experience with workflow orchestration tools (Airflow / Cloud Composer)
  • Solid understanding of data warehousing and ETL/ELT concepts
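As an illustration of the "strong SQL skills" the role asks for, here is a basic analytical aggregate run against Python's stdlib sqlite3. BigQuery's Standard SQL is very close for a simple GROUP BY like this, though its partitioning and clustering DDL is BigQuery-specific; the table and column names are hypothetical.

```python
# A bread-and-butter analytical aggregate (GROUP BY + COUNT),
# run on an in-memory sqlite3 database as a stand-in for BigQuery.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "click"), (1, "view"), (2, "click")],
)

# Per-event counts, ordered for deterministic output.
rows = conn.execute(
    "SELECT event, COUNT(*) AS n FROM events GROUP BY event ORDER BY event"
).fetchall()
print(rows)
```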

Good to Have (Preferred Skills)

  • Experience with Pub/Sub for real-time streaming pipelines
  • Spark / Dataproc experience
  • Infrastructure as Code (Terraform, Deployment Manager)
  • CI/CD pipelines and Git-based workflows
  • GCP Professional Data Engineer certification
  • Exposure to data formats like Parquet, Avro, JSON
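Of the formats listed above, newline-delimited JSON is the one handled with the stdlib alone (it is also a format BigQuery load jobs accept). A quick sketch of a write/read round trip; Parquet and Avro handling would use libraries such as pyarrow or fastavro, which are not shown here.

```python
# Serialize records to newline-delimited JSON and read them back,
# as a simple ingestion step might. Record contents are made up.
import io
import json

records = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

# Write: one JSON object per line (NDJSON).
buf = io.StringIO()
for rec in records:
    buf.write(json.dumps(rec) + "\n")

# Read: parse each line independently.
buf.seek(0)
round_tripped = [json.loads(line) for line in buf]
print(round_tripped)
```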

Job ID: 145322667