Job Title: GCP Senior Data Engineer/Architect
Experience: 8 - 14 Years
Location: Remote / Bengaluru, Karnataka, India (must be willing to work in the UK shift)
About The Role
We are seeking a highly experienced and skilled GCP Senior Data Engineer/Architect to join our dynamic team. In this role, you will be instrumental in designing, developing, and implementing robust, scalable data solutions on Google Cloud Platform (GCP). You will work closely with Architects and Business Analysts, primarily for our US clients, to understand their data requirements and translate them into effective technical solutions. This role demands a strong understanding of data warehousing and data lake principles, extensive hands-on experience with GCP data services, and excellent communication skills.
Responsibilities
- Design and implement scalable and efficient data warehouse and data lake solutions on GCP.
- Architect, develop, orchestrate, and performance-tune complex data pipelines using GCP services and other relevant tools.
- Lead and contribute to ongoing cloud data lake implementation projects.
- Participate in and contribute to cloud migration projects.
- Develop and manage containerized applications using Docker and Google Kubernetes Engine (GKE).
- Write and optimize complex SQL queries.
- Develop data processing and automation scripts using Python.
- Utilize GCP command-line utilities for infrastructure management and automation.
- Work extensively with GCP data services including BigQuery, Bigtable, and Cloud SQL (MySQL and PostgreSQL).
- Implement and manage security and access controls using GCP IAM.
- Use Google Cloud Storage (GCS) for data storage and management.
- Implement event-driven architectures using Pub/Sub.
- Build and manage ETL/ELT processes using Dataflow and Dataproc.
- Orchestrate workflows using Cloud Composer (managed Apache Airflow).
- Integrate data from various sources using ingestion services such as Airbyte and Fivetran.
- Collaborate effectively with Architects and Business Analysts to gather requirements and provide technical expertise.
- Work independently as an individual contributor or collaboratively as part of a team, adapting to changing project needs.
- Provide constructive feedback and be receptive to feedback from others.
Technical Skills
Must Have
- Strong understanding of data warehouse and data lake design and implementation principles.
- Proven experience in designing, developing, orchestrating, and performance tuning data pipelines.
- Experience working on at least one cloud migration project.
- A minimum of 2 years' demonstrated experience on an ongoing cloud data lake implementation project.
- Solid understanding of Docker and containerized applications, with hands-on experience using GKE.
- Proficiency in SQL for data querying and manipulation.
- Strong scripting skills in Python for data processing and automation.
- Experience with GCP command-line utilities (e.g., gcloud, gsutil, bq).
- Extensive hands-on experience with:
  - GCP BigQuery
  - GCP Bigtable
  - GCP Cloud SQL (managed MySQL and PostgreSQL)
  - GCP IAM
  - GCP Cloud Storage (GCS)
  - GCP Pub/Sub
  - GCP Dataflow
  - GCP Dataproc
  - GCP Cloud Composer (Airflow)
- Experience working with data ingestion services such as Airbyte, Fivetran, or similar tools.
Good To Have
- Knowledge of data governance principles and their implementation on GCP.
- Experience with dbt (data build tool).
- Experience with AlloyDB.
Personality Traits
- Strong communication and requirements-gathering skills to interact effectively with Architects and Business Analysts for US clients.
- Ability to work effectively as an individual contributor and collaboratively within a team, adapting to project demands.
- Willingness to work UK shift timings.
- Open to receiving and providing constructive feedback.