DevOps Architect


Job Description

About the Role

We are seeking an experienced DevOps Architect (12+ years of experience) to lead the design, implementation, and governance of enterprise-grade DevOps and Data Platform infrastructure. This role will be responsible for architecting scalable CI/CD frameworks, containerized deployments, secure cloud-native environments, and enterprise data operations across Snowflake and Datacoves (DBT + Airflow). The ideal candidate will combine deep DevOps expertise with strong data platform experience and security-first architecture thinking.

Responsibilities

DevOps Architecture & Strategy

  • Define and implement enterprise DevOps architecture, standards, and best practices.
  • Design highly scalable and resilient CI/CD frameworks using Jenkins.
  • Architect end-to-end DevOps pipelines supporting DBT, Airflow, and Snowflake workloads.
  • Establish DevSecOps practices including automated security scanning and compliance checks.
  • Provide technical leadership, governance, and mentoring to DevOps and Data Engineering teams.

Kubernetes & Containerization

  • Architect and manage containerized workloads using Kubernetes.
  • Design Kubernetes cluster strategy (multi-namespace, multi-environment, HA, auto-scaling).
  • Implement Helm charts, deployment strategies (blue-green, canary), and cluster security policies.
  • Optimize container orchestration for Airflow and related data services.
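
To illustrate the hands-on depth expected here, a minimal sketch of a canary readiness gate using the official Kubernetes Python client; the namespace and deployment names are hypothetical placeholders.

    from kubernetes import client, config

    def canary_ready(namespace: str, deployment: str) -> bool:
        """Return True once the canary deployment reports all replicas ready."""
        config.load_kube_config()      # use config.load_incluster_config() when running in-cluster
        apps = client.AppsV1Api()
        dep = apps.read_namespaced_deployment(name=deployment, namespace=namespace)
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        return desired > 0 and ready == desired

    if __name__ == "__main__":
        # Hypothetical names; a blue-green or canary promotion step would gate on this check.
        print(canary_ready("data-platform", "airflow-webserver-canary"))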

Datacoves (DBT + Airflow) Platform Architecture

  • Architect and govern DBT transformation layers:
      • Staging models
      • Intermediate models
      • Data marts
  • Implement:
      • Incremental models
      • Snapshot strategies for historical tracking
      • Surrogate key management
  • Architect DevOps workflows within the Datacoves framework.
  • Design and govern Airflow DAG orchestration, monitoring, and failure recovery strategies.
  • Implement automated DBT deployment frameworks with version-controlled CI/CD.
  • Establish testing, lineage, and observability standards for data pipelines.
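
For context on the orchestration and failure-recovery standards above, a minimal sketch of an Airflow DAG with retries, an SLA, and a failure callback around DBT steps (assuming a recent Airflow 2.x; the DAG ID, schedule, and dbt targets are illustrative only).

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    def notify_on_failure(context):
        # Placeholder failure-recovery hook; in practice this might page on-call
        # or kick off an automated rerun.
        print(f"Task {context['task_instance'].task_id} failed")

    default_args = {
        "owner": "data-platform",
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,
    }

    with DAG(
        dag_id="dbt_daily_transformations",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --target prod",
            sla=timedelta(hours=1),           # SLA monitoring on the transformation step
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --target prod",
        )
        dbt_run >> dbt_test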

Snowflake Administration & Optimization

  • Lead Snowflake platform administration including:
      • Role-based access control (RBAC)
      • Warehouse sizing & performance tuning
      • Resource monitors & cost governance
      • Secure data sharing
  • Implement Snowflake CI/CD integration with Jenkins and DBT.
  • Architect secure network integrations and private connectivity where applicable.
  • Ensure high availability, disaster recovery, and backup strategies.
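
A minimal sketch of the kind of Snowflake governance automation implied above, assuming the snowflake-connector-python package; the account, warehouse, role, and quota values are hypothetical.

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_org-my_account",        # hypothetical account identifier
        user="DEVOPS_ADMIN",
        authenticator="externalbrowser",    # or key-pair / OAuth per enterprise policy
    )

    statements = [
        # Cost governance: cap monthly credits and suspend the warehouse at the limit.
        "CREATE RESOURCE MONITOR analytics_rm "
        "WITH CREDIT_QUOTA = 100 TRIGGERS ON 100 PERCENT DO SUSPEND",
        "ALTER WAREHOUSE ANALYTICS_WH SET RESOURCE_MONITOR = analytics_rm",
        # RBAC: least-privilege grants for a transformation role.
        "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE TRANSFORMER",
        "GRANT USAGE ON DATABASE ANALYTICS TO ROLE TRANSFORMER",
    ]

    with conn.cursor() as cur:
        for stmt in statements:
            cur.execute(stmt)
    conn.close()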

Security & SSL Management

  • Architect and manage the SSL/TLS certificate lifecycle.
  • Implement certificate rotation, renewal automation, and secure endpoint configurations.
  • Ensure secure communication between services, APIs, and data platforms.
  • Enforce encryption standards for data at rest and in transit.
  • Collaborate with InfoSec teams to meet enterprise compliance requirements.
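
As a sketch of the renewal automation mentioned above, a certificate-expiry check using only the Python standard library; the endpoint is a placeholder, and a real rotation job would feed this into alerting or renewal workflows.

    import socket
    import ssl
    from datetime import datetime, timezone

    def days_until_expiry(host: str, port: int = 443) -> int:
        """Return the number of days before the host's TLS certificate expires."""
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
        expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
        return (expires - datetime.now(timezone.utc)).days

    if __name__ == "__main__":
        print(days_until_expiry("example.com"))   # placeholder endpoint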

Governance, Automation & Observability

  • Implement Infrastructure as Code (IaC) using Terraform/CloudFormation (if applicable).
  • Establish monitoring and observability frameworks for pipelines and infrastructure.
  • Lead root cause analysis, incident response automation, and platform reliability engineering.
  • Drive continuous improvement in DevOps maturity and platform automation.

ETL / ELT Orchestration & Automation

  • Design ELT-first architecture leveraging Snowflake processing power.
  • Orchestrate complex workflows using Apache Airflow:
      • DAG dependency management
      • SLA monitoring
      • Automated recovery workflows
  • Implement CI/CD for:
      • DBT deployments
      • Airflow pipelines
      • Snowflake objects
  • Build data observability frameworks (pipeline monitoring, anomaly detection).
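
To ground the observability point above, a minimal sketch of a volume-based anomaly check of the kind such a framework might run per pipeline; the thresholds and row counts are illustrative only.

    from statistics import mean, pstdev

    def is_volume_anomaly(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
        """Flag today's load volume if it deviates from the trailing history
        by more than z_threshold standard deviations (a simple z-score check)."""
        if len(history) < 7 or pstdev(history) == 0:
            return False                            # not enough signal to judge
        z = abs(today - mean(history)) / pstdev(history)
        return z > z_threshold

    if __name__ == "__main__":
        # Illustrative row counts for the trailing two weeks, then today's load.
        trailing = [10_200, 10_450, 9_980, 10_300, 10_150, 10_500, 10_050,
                    10_400, 10_250, 10_100, 10_350, 10_600, 10_200, 10_300]
        print(is_volume_anomaly(trailing, today=4_200))   # True: volume dropped sharply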

Qualifications

  • 12+ years of experience in Data Engineering and Enterprise Data Platforms.
  • Deep hands-on expertise in:
      • Jenkins (Pipeline as Code, shared libraries, distributed builds)
      • Kubernetes (cluster architecture, scaling, security)
      • Datacoves (DBT + Airflow orchestration)
      • Snowflake Administration
      • SSL/TLS certificate management
  • Strong knowledge of CI/CD, Git branching strategies, and release management.
  • Advanced SQL proficiency and strong understanding of data warehousing principles.
  • Experience with scripting/programming (Python preferred).
  • Experience working in Agile/Scrum environments.

Required Skills

  • Jenkins (Pipeline as Code, shared libraries, distributed builds)
  • Kubernetes (cluster architecture, scaling, security)
  • Datacoves (DBT + Airflow orchestration)
  • Snowflake Administration
  • SSL/TLS certificate management

Preferred Skills

  • Experience with scripting/programming (Python preferred).

Job ID: 143399547
