CloudHire

Lead Architect

8-10 Years

This job is no longer accepting applications

  • Posted 3 months ago

Job Description

Position Overview

Architect and implement enterprise-grade data migration solutions using Java and Python, enabling seamless data transfers from on-premises to GCP (Cloud Storage, BigQuery, Pub/Sub) using Apache Airflow and Google Cloud Composer.

Build secure, scalable, and optimized data architectures leveraging GCP services such as Cloud Storage, Pub/Sub, Dataproc, Dataflow, and BigQuery.

Design and implement automated frameworks for data delivery, monitoring, and troubleshooting.

Develop data observability frameworks to ensure quality, lineage, and reliability across pipelines.

Proactively monitor system performance, identify bottlenecks, and optimize pipelines for efficiency, scalability, and cost.

Troubleshoot and resolve complex technical issues in distributed systems and cloud environments.

Drive best practices in documentation of tools, architecture, processes, and solutions.

Mentor junior engineers, conduct design/code reviews, and influence engineering standards.

Collaborate with cross-functional teams to enable AI/ML and GenAI-driven use cases on LUMI.

Minimum Qualifications:

  1. 8+ years of experience in data engineering, software engineering, or platform development.
  2. Strong programming expertise in Java, Python, and Shell scripting.
  3. Advanced knowledge of SQL, data modeling, and performance optimization.
  4. Deep expertise in Google Cloud Platform services: Cloud Storage, BigQuery, Pub/Sub, Dataproc, Dataflow.
  5. Strong background in RDBMS (Oracle, Postgres, MySQL) and exposure to NoSQL DBs (Cassandra, MongoDB, or similar).
  6. Proven track record in CI/CD pipelines, Git workflows, and Agile development.
  7. Demonstrated experience in building and scaling production-grade data pipelines.
  8. Strong problem-solving and troubleshooting skills in distributed and cloud-native systems.

Our Client is a global powerhouse in digital transformation, headquartered in France and operating across 68 countries with approximately 74,000 employees. As a European leader in Cybersecurity, Cloud, and High-Performance Computing, it serves as a mission-critical partner for some of the world's most complex industries, including Defense, Healthcare, and Financial Services. In 2025, the company launched its Genesis strategic plan, aiming to streamline its operations into six core business lines (including Data & AI and Cloud Infrastructure) to reach a revenue target of nearly €10 billion by 2028.

Required Skills

Java

Python

Shell Scripting

SQL

GCP

CI/CD

Data Modeling

Qualifications

  • Degree in Engineering, MBA, or MCA; certified in GCP, Azure, or AWS.

Technical Requirements

Java, Python, Shell scripting, SQL, GCP, CI/CD, data modeling, performance optimization, Cloud Storage, BigQuery, Pub/Sub, Dataproc, Dataflow, Oracle, Postgres, MySQL, Cassandra, MongoDB, Apache Airflow, Google Cloud Composer, Docker, Git, Agile development, data pipelines, troubleshooting, distributed systems, cloud-native systems

Benefits & Rewards

Competitive Salary and Hybrid Model

Job ID: 139163403
