GCP Data Engineer

  • Posted 20 days ago

Job Description

We are looking for an experienced Data Engineer with a solid background in data engineering, cloud storage, and modern data platforms. The ideal candidate will be responsible for building scalable data pipelines, ETL/ELT workflows, and data models to support high-quality analytics and reporting.

Key Responsibilities:

  • Design, develop, and optimize scalable data pipelines and ETL/ELT workflows.
  • Build robust data models to support business intelligence and advanced analytics.
  • Work with structured and unstructured data across a range of databases and storage solutions.
  • Collaborate with analytics, product, and engineering teams to support data needs across the organization.
  • Ensure data quality, performance, and governance across the pipeline lifecycle.

Required Skills and Experience:

  • Strong expertise in SQL, including complex joins, stored procedures, and CTEs (common table expressions).
  • Experience with NoSQL databases such as Firestore, DynamoDB, and MongoDB.
  • Proficiency in data warehousing and data modeling on platforms such as BigQuery (preferred), Redshift, or Snowflake.
  • Hands-on experience with ETL/ELT tools and frameworks, including Apache Airflow, dbt, Kafka, and Apache Spark.
  • Proficiency in Python, PySpark, or Scala for data transformation and automation.
  • Strong practical knowledge of Google Cloud Platform (GCP) and its native services.

Preferred Skills:

  • Experience with data visualization tools such as Google Looker Studio, LookML, Power BI, or Tableau.
  • Exposure to Master Data Management (MDM) systems.
  • Interest or experience in Web3 data and blockchain analytics.

Job ID: 120571071