GCP Data Engineer

Experience: 5-7 years

Job Description

We are looking for an experienced Data Engineer with a strong background in data engineering, storage, and cloud technologies. The role involves designing, building, and optimizing scalable data pipelines, ETL/ELT workflows, and data models for efficient analytics and reporting. The ideal candidate must have strong SQL expertise and hands-on experience with cloud platforms, particularly Google Cloud Platform (GCP).

Key Responsibilities

  • Design, build, and optimize scalable data pipelines.
  • Develop and manage ETL/ELT workflows and data models.
  • Write complex SQL queries, including joins, stored procedures, and CTE-based queries.
  • Work with NoSQL databases such as Firestore, DynamoDB, or MongoDB.
  • Develop and maintain data models and warehousing solutions using platforms like BigQuery (preferred), Redshift, or Snowflake.
  • Build and manage ETL/ELT pipelines using tools like Airflow, dbt, Kafka, or Spark.
  • Use PySpark, Python, or Scala to create data processing jobs.
  • Collaborate with data analysts and other teams to support their data needs.

Required Skills

  • Strong SQL expertise.
  • Experience with NoSQL databases (Firestore, DynamoDB, or MongoDB).
  • Proficiency in data modeling and data warehousing solutions (BigQuery, Redshift, or Snowflake).
  • Hands-on experience with ETL/ELT pipelines and orchestration tools (Airflow, dbt, Kafka, or Spark).
  • Proficiency in PySpark, Python, or Scala.
  • Strong hands-on experience with Google Cloud Platform (GCP).

Good-to-Have Skills

  • Experience with visualization tools such as Google Looker Studio, LookML, Power BI, or Tableau.
  • Exposure to Master Data Management (MDM) systems.
  • Interest in Web3 data and blockchain analytics.

Job ID: 120345885
