eazyway.io

DevOps Engineer | Data Platform

Job Description

We're Hiring: DevOps Engineer | Data Platform

Eazyway is hiring on behalf of DataPMI (datapmi.com). We are seeking a skilled DevOps Engineer to support and automate our data platform ecosystems. The ideal candidate will have a strong focus on CI/CD for modern data stacks, particularly around Databricks, dbt, Airflow, and Azure DevOps. You will be responsible for building, deploying, and managing scalable, reliable, and automated data engineering pipelines.

Location: Bangalore

Total Positions: 1

Experience: 3+ years

Work Type: Full-time | On-site

Primary Skills (Mandatory)

  • CI/CD in Databricks & dbt – Design and implement automated CI/CD workflows for Databricks notebooks, jobs, and dbt models (e.g., using dbt Cloud or dbt Core with Databricks).
  • Apache Airflow – Manage and deploy DAGs; integrate Airflow with Databricks operators; handle environment and dependency management (see the DAG sketch after this list).
  • Azure DevOps YAML Pipelines – Build and maintain multi-stage YAML pipelines for CI/CD, including variable groups, service connections, and approvals.
  • Databricks Asset Bundles (DABs) – Use DABs to define, deploy, and manage Databricks resources (jobs, notebooks, models, etc.) as code.
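
As an illustration of the Airflow–Databricks integration above, here is a minimal DAG sketch. It assumes the apache-airflow-providers-databricks package and Airflow 2.4+; the connection name, schedule, notebook path, and cluster spec are placeholders, not details of this role's actual stack.

```python
# Minimal sketch: trigger a Databricks notebook run from Airflow.
# Assumes an Airflow connection named "databricks_default" is configured.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="nightly_databricks_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # nightly at 02:00 (use schedule_interval on Airflow < 2.4)
    catchup=False,
) as dag:
    run_notebook = DatabricksSubmitRunOperator(
        task_id="run_etl_notebook",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "14.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",  # Azure VM type
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Repos/data/etl/nightly_load"},
    )
```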

Additional DevOps Skills (Nice to Have / Growing Need)

Infrastructure as Code

  • Terraform – Provision Azure resources (Databricks workspaces, ADLS, Key Vaults, VNets, Airflow environments).

CI/CD & Automation

  • GitHub Actions or GitLab CI
  • dbt CI/CD – Integration with dbt Cloud or dbt Core + GitHub + Databricks
  • Unit & integration testing in CI pipelines (e.g., pytest for Databricks notebooks, dbt tests); a pytest sketch follows this list
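
As a sketch of the kind of unit test a CI pipeline would run, assuming notebook logic has been factored into plain Python functions (the normalize_columns helper below is a hypothetical stand-in for code that would normally live in a shared module imported by both the notebook and the test):

```python
# Minimal pytest sketch: pure-Python transformation logic tested without a cluster.
import re


def normalize_columns(names: list[str]) -> list[str]:
    """Lower-case names and collapse non-alphanumeric runs to underscores."""
    return [re.sub(r"[^0-9a-z]+", "_", n.strip().lower()).strip("_") for n in names]


def test_normalize_columns_handles_spaces_and_punctuation():
    raw = ["Customer ID ", " Order-Date"]
    assert normalize_columns(raw) == ["customer_id", "order_date"]
```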

Containerization & Orchestration

  • Docker: Containerising Airflow workers, dbt execution environments, or custom Databricks Python wheels
  • Kubernetes (AKS): Running Airflow or custom data apps (helpful but not mandatory)

Monitoring & Observability

  • Azure Monitor, Log Analytics Workspaces
  • Prometheus + Grafana (if Airflow runs on K8S)
  • Databricks Jobs & cluster monitoring, alerting on pipeline failures (see the sketch after this list)
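
One way to implement failure alerting, sketched here with the databricks-sdk package (an assumption; any equivalent API client would do). Auth is read from the environment, and the print is a placeholder for a real notification channel:

```python
# Minimal sketch: scan recent completed runs and flag failures.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import RunResultState

w = WorkspaceClient()  # credentials from DATABRICKS_HOST / DATABRICKS_TOKEN

for run in w.jobs.list_runs(completed_only=True, limit=25):
    state = run.state
    if state and state.result_state == RunResultState.FAILED:
        # Replace with an Azure Monitor alert, Slack webhook, PagerDuty, etc.
        print(f"ALERT: job {run.job_id} run {run.run_id} failed: {state.state_message}")
```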

Security & Compliance

  • Azure Key Vault for secrets management (see the sketch after this list)
  • Service Principals, Managed Identities, RBAC for Databricks & Airflow
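
A minimal sketch of reading a secret with a managed identity, assuming the azure-identity and azure-keyvault-secrets packages are installed; the vault URL and secret name are placeholders:

```python
# Minimal sketch: fetch a Databricks token from Key Vault without hard-coded secrets.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()  # managed identity, az login, or env vars
client = SecretClient(
    vault_url="https://my-data-kv.vault.azure.net",  # placeholder vault
    credential=credential,
)

dbx_token = client.get_secret("databricks-pat").value
```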

Version Control & Collaboration

  • Git (GitHub / Azure Repos) – Branching strategies (GitFlow / trunk-based) for data codebases

Key Responsibilities

Build & Maintain CI/CD Pipelines

  • Automate deployment of Databricks jobs, notebooks, libraries, and dbt models (see the sketch after this list).
  • Use Azure DevOps YAML pipelines to promote code across dev, test, and prod environments.
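
As a sketch of a deployment step a YAML pipeline stage might call, assuming the databricks-sdk package and environment-based auth; the local file and workspace path are placeholders:

```python
# Minimal sketch: push a notebook source file into the target workspace.
import io

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ImportFormat, Language

w = WorkspaceClient()  # DATABRICKS_HOST / DATABRICKS_TOKEN from pipeline variables

with open("notebooks/nightly_load.py", "rb") as f:
    w.workspace.upload(
        "/Shared/etl/nightly_load",  # target path (placeholder)
        io.BytesIO(f.read()),
        format=ImportFormat.SOURCE,
        language=Language.PYTHON,
        overwrite=True,  # keep redeploys idempotent across dev/test/prod
    )
```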

Manage Databricks Environments

  • Implement Databricks Asset Bundles for repeatable, versioned deployments.
  • Control cluster policies, secrets, and job schedules as code.

Operationalize Airflow

  • Deploy, upgrade, and scale Airflow (e.g., on AKS or Azure Container Instances).
  • Automate DAG deployment via Git + CI/CD.

Collaborate with Data Engineers

  • Enable dbt + Databricks workflows with proper CI (e.g., dbt build --select state:modified); a sketch follows this list.
  • Troubleshoot pipeline failures, performance issues, and dependency conflicts.
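
A minimal sketch of invoking that state-based selection from a CI script, assuming dbt Core >= 1.5 (which exposes the programmatic dbtRunner) and production artifacts (manifest.json) downloaded by the pipeline into ./prod-artifacts (a placeholder path):

```python
# Minimal sketch: build only models changed relative to the production manifest.
from dbt.cli.main import dbtRunner

result = dbtRunner().invoke(
    ["build", "--select", "state:modified", "--state", "./prod-artifacts"]
)
if not result.success:
    raise SystemExit(1)  # fail the CI job if any modified model or test fails
```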

Implement IaC & Environment Consistency

  • Use Terraform to spin up and tear down ephemeral environments for testing.

Required Qualifications

  • 3+ years of DevOps experience with a focus on data/analytics platforms
  • Strong hands-on experience with Azure DevOps YAML pipelines
  • Strong experience with Databricks and dbt (Core or Cloud)
  • Working knowledge of Apache Airflow (deployment, not just authoring)
  • Experience with Databricks Asset Bundles
  • Proficiency in Python

Preferred Qualifications

  • Databricks Certified: Data Engineer or Platform Admin
  • Azure DevOps Certification
  • Experience with dbt Cloud APIs and CI jobs
  • Exposure to Delta Lake, Spark tuning, or Unity Catalog

Job Code: EZW-100125

Only 1 Vacancy | Attractive Salary

How to Apply

Step 1: Register as a Job Seeker using the link below:

https://eazyway.io/job-seeker/register

Step 2: Log in after completing the registration

Step 3: Search using Job Code: EZW-100125 (Recruitment Job Section)

Step 4: Submit your application

Apply Here: https://eazyway.io/job-seeker/register

Hurry up and sign up to apply!

Interested, or know someone who might be? Tag them or share this post with them.
