
Data Engineer - GCP

5-10 Years
10 - 20 LPA
  • Posted 11 hours ago

Job Description

For this GCP requirement, hands-on GCP experience is mandatory, including Dataflow, Data Fusion, and BigQuery.

• Overall 8 to 12 years of experience.

• Strong experience in Python and SQL for data engineering and data processing.

• Hands-on experience with Apache Spark / PySpark for large-scale data processing.

• Experience building and orchestrating pipelines using Apache Airflow.

• Experience working with Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Dataproc, Composer, Data Fusion, Cloud Build, etc.

• Understanding of data modeling techniques (dimensional modeling, data warehouse concepts).

• Experience designing scalable ETL/ELT pipelines and data architectures.

• Strong problem-solving, debugging, and performance-optimization skills for large datasets.


Position Summary

Experienced Senior Data Engineer utilizing Big Data & Google Cloud technologies to develop large-scale, on-cloud data processing pipelines and data warehouses.

What you'll do

• Design and develop scalable data pipelines using SQL, Python, PySpark, and Apache Airflow on Google Cloud Platform (GCP).

• Perform data modeling to support analytics and downstream applications.

• Architect and implement custom cloud solutions that integrate Adobe platforms in a scalable, reliable, and high-performance manner.

• Deliver complex, large-scale, enterprise-grade cloud data engineering and integration solutions through hands-on development and technical leadership.

Requirements

• Overall 5 to 10 years of experience.

• Strong experience in Python and SQL for data engineering and data processing.

• Hands-on experience with Apache Spark / PySpark for large-scale data processing.

• Experience building and orchestrating pipelines using Apache Airflow.

• Experience working with Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Dataproc, Composer, Data Fusion, Cloud Build, etc.

• Understanding of data modeling techniques (dimensional modeling, data warehouse concepts).

• Experience designing scalable ETL/ELT pipelines and data architectures.

• Strong problem-solving, debugging, and performance-optimization skills for large datasets.

Good to have

• Experience consulting for India-based customers and in the BFSI domain.

• Experience with Adobe Experience Platform and/or Adobe Campaign Classic.

• Multi-cloud expertise, preferably across GCP, Azure, and AWS.

More Info

Open to candidates from: Indian

Job ID: 145634061
