
axim digitech

GCP Data Engineer

  • Posted 3 hours ago

Job Description

Job Profile:

Migration & Architecture:

  • Lead Big Data platform migrations from on‑prem Hadoop / legacy data platforms to GCP
  • Design target-state architectures using BigQuery, Dataproc, Dataflow, GCS
  • Define migration strategies (re-host, re-platform, re-engineer)
  • Modernize ETL pipelines to cloud-native or Spark-based solutions

Development & Engineering:

  • Architect and develop large-scale PySpark applications
  • Refactor existing Spark, Hive, or MapReduce jobs for GCP
  • Optimize data pipelines for performance, scalability, and cost
  • Implement batch and streaming workloads on GCP

Candidate's Profile:

  • BE/B.Tech or BCA/MCA with 5+ years of experience as a GCP/Big Data Engineer designing target-state architectures using BigQuery, Dataproc, Dataflow, and GCS
  • Must have experience in Big Data platform migrations from on‑prem Hadoop / legacy data platforms to GCP
  • Ready to work in Bangalore
  • Ready to join within 15 days

Job ID: 147507805
