
Google

Senior DataOps Architect (GCP)

10-12 Years
  • Posted 7 days ago

Job Description

Job Overview

We are looking for a Senior DataOps Architect to help define and lead the architectural direction of our large-scale data platforms. In this role, you will design modern, scalable data infrastructure across hybrid on-prem and cloud environments, driving best practices in automation, reliability, and DataOps across the organization. You will partner closely with engineering and leadership teams to deliver high-performance, secure, and globally scalable data solutions.

Key Responsibilities

  • Design and implement large-scale, high-availability data platform solutions across hybrid environments (on-prem + GCP primary, Azure secondary)
  • Architect DataOps pipelines and automation workflows using Terraform, Ansible, and other Infrastructure-as-Code frameworks
  • Lead the end-to-end design and lifecycle management of data infrastructure running across on-premises data centers and cloud platforms
  • Establish and enforce DataOps best practices including CI/CD for data pipelines, automated testing, and data quality frameworks
  • Define and guide cloud migration strategies from on-prem to GCP, including hybrid patterns and data residency considerations
  • Build and optimize data infrastructure leveraging Kubernetes, container orchestration, and service mesh technologies
  • Develop high-quality technical and architectural documentation for data infrastructure and DataOps workflows
  • Evaluate and introduce modern tooling, platforms, and methodologies to improve reliability, efficiency, and developer velocity
  • Mentor engineering teams on DataOps principles, cloud-native architectures, and infrastructure automation best practices
  • Collaborate with platform, data, and security teams to ensure governance, compliance, and operational excellence
  • Stay up to date on DataOps, cloud, and data infrastructure technologies, proactively recommending improvements

Basic Qualifications

  • Bachelor's degree in Computer Science, Data Engineering, or related technical field
  • 10+ years of experience in Infrastructure, DevOps, or DataOps roles
  • Proven experience architecting and operating on-prem data infrastructure and hybrid cloud environments
  • Deep expertise (5+ years) with GCP data-related services (BigQuery, Dataflow, Composer, GKE, Cloud Storage, IAM)
  • Strong hands-on background (5+ years) with on-prem technologies including bare metal, virtualization (VMware, KVM), storage, and networking
  • Expert-level proficiency with Kubernetes, Helm, and service mesh frameworks (Istio, Linkerd) in production environments
  • Experience with modern orchestration tools such as Apache Airflow, Prefect, or Dagster
  • Familiarity with monitoring and observability stacks (Prometheus, Grafana, ELK, Datadog)
  • Experience with GitOps and CI/CD platforms (GitLab CI, GitHub Actions, Jenkins)
  • Understanding of data security, compliance, and disaster recovery planning
  • Advanced proficiency with Terraform, Ansible, and CloudFormation
  • Strong communication skills and the ability to explain complex technical concepts to diverse stakeholders
  • Experience with Azure services and hybrid connectivity technologies (ExpressRoute, VPN) is a plus

Job ID: 144089907
