IntraEdge

GCP Data Engineer

  • Posted 4 hours ago

Job Description

Website: https://intraedge.com/

Job Description – GCP Data Engineer

Position: GCP Data Engineer

Experience: 3–5 Years

Location: Gurugram (Hybrid)

Job Summary

We are looking for a skilled and motivated GCP Data Engineer with 3–5 years of experience building scalable data pipelines and cloud-based data solutions. The ideal candidate has strong expertise in Google Cloud Platform (GCP), Python, APIs, and Big Data technologies. The role involves designing, developing, and optimizing data processing systems that support business analytics and data-driven decision-making.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT data pipelines on GCP.
  • Work with Big Data technologies for processing large-scale structured and unstructured datasets.
  • Develop and integrate REST APIs for data ingestion and processing.
  • Build and optimize data workflows using Python.
  • Collaborate with cross-functional teams, including Data Analysts, Data Scientists, and business stakeholders.
  • Ensure data quality, security, governance, and performance optimization.
  • Monitor and troubleshoot data pipelines and cloud infrastructure issues.
  • Participate in code reviews, deployment, and release activities.

Required Skills

  • Strong experience in Google Cloud Platform (GCP) services.
  • Hands-on experience with Python scripting and development.
  • Hands-on experience with PySpark scripting and development.
  • Experience working with APIs and data integration.
  • Knowledge of Big Data technologies and distributed data processing.
  • Experience with ETL pipeline development and data transformation.
  • Good understanding of SQL and database concepts.
  • Familiarity with cloud data warehousing concepts.

Preferred Skills

  • Experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, or Dataproc.
  • Knowledge of CI/CD and DevOps practices.
  • Understanding of data modeling and data architecture.
  • Good problem-solving and communication skills.

Educational Qualification

  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Notice Period

  • Immediate joiners or candidates with a short notice period preferred.

Job ID: 147506009
