
GCP Data Engineer

Experience: 6-8 Years
  • Posted a day ago

Job Description

Key Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain scalable batch and streaming data pipelines on Google Cloud Platform (GCP).
  • Big Data Tools: Work extensively with BigQuery, Dataflow, Pub/Sub, Cloud Composer (Airflow), and Dataplex for data integration and orchestration.
  • Programming: Write optimized SQL and Python code for data processing and transformations.
  • Streaming & Batch Processing: Build and manage pipelines using Apache Beam for both batch and streaming workloads.
  • CI/CD & Infrastructure: Implement CI/CD workflows and Infrastructure as Code using GitHub Actions.
  • Governance & Security: Ensure data governance, security, observability, and monitoring of data pipelines.
  • Performance & Cost Optimization: Optimize pipelines for performance, scalability, and cost efficiency.
  • AI/ML Integration: Integrate data pipelines with Vertex AI to support AI/ML use cases.
  • Collaboration: Work closely with cross-functional engineering and data teams to ensure reliable delivery of data solutions.

More Info

Open to candidates from: Indian

Job ID: 143278923
