DevOps Engineer Supporting the Data Science Team
What You'll Do
- Architect, deploy, and operate scalable data infrastructure in Azure Cloud.
- Build and maintain the observability stack (monitoring, logging, alerting) in Azure.
- Run containerized workloads with Docker and Kubernetes (AKS).
- Deploy and operate orchestration platforms, including Airflow and Kubeflow.
- Design and maintain CI/CD pipelines using Jenkins, GitHub Actions, and Terraform.
- Harden containers and clusters (image scanning, RBAC, secrets, certificates, config management).
- Build and support data workloads in Snowflake.
- Own platform operations: upgrades, capacity planning, incident response, and performance tuning.
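As a rough illustration of the CI/CD responsibility above, a minimal GitHub Actions workflow that builds and pushes a Docker image and runs a Terraform plan might look like the following sketch. The registry name, image name, and workflow layout are placeholders for illustration, not details from this role:

```yaml
# .github/workflows/ci.yml — illustrative sketch only; registry and image names are placeholders
name: build-and-plan
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Log in to Azure Container Registry
        run: az acr login --name myregistry
      - name: Build Docker image
        run: docker build -t myregistry.azurecr.io/data-platform:${{ github.sha }} .
      - name: Push image
        run: docker push myregistry.azurecr.io/data-platform:${{ github.sha }}
  terraform:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - run: terraform init
      - run: terraform plan
```

In practice, pipelines like this also add image scanning and environment-scoped approvals before `terraform apply`.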
Required Qualifications
- 6 years with Kubernetes (AKS).
- 6 years hands-on with Azure Cloud.
- 6 years deploying Docker images with Airflow and Kubeflow.
- Strong understanding of Kubernetes architecture.
- Good experience with agentic AI frameworks.
- 6 years of Python for automation or data workflows.
- Experience with Cloud Composer (Airflow).
- Proficiency in Linux and scripting (Bash, Python).
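The Python requirement above typically means small glue scripts around cluster APIs and CLI output. A minimal sketch of that kind of automation, parsing `kubectl get pod -o json` output to flag non-ready containers (the pod JSON below is hand-written sample data, not output from a real cluster):

```python
import json

def failing_containers(pod_json: str) -> list[str]:
    """Return names of containers in a pod that are not ready.

    Expects the JSON shape produced by `kubectl get pod <name> -o json`.
    """
    pod = json.loads(pod_json)
    statuses = pod.get("status", {}).get("containerStatuses", [])
    return [c["name"] for c in statuses if not c.get("ready", False)]

# Hand-written sample pod status for illustration:
sample = json.dumps({
    "status": {
        "containerStatuses": [
            {"name": "scheduler", "ready": True},
            {"name": "worker", "ready": False},
        ]
    }
})
print(failing_containers(sample))  # ['worker']
```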
Preferred Qualifications
- Experience with Helm, Terraform, Jenkins, and GitHub Actions.
- Exposure to SQL/NoSQL and CloudSQL.
- Hands-on migration of legacy apps to Kubernetes.
- Familiarity with open-source data tooling and GCP services.
- Background in cloud security & compliance.
Tech Stack
- Azure (Composer/Airflow, CloudSQL, Pub/Sub, Artifact Registry)
- Docker
- Kubernetes
- Jenkins
- GitHub
- Terraform
- Snowflake
- Linux
- Python
- Airflow
- Kubeflow
What To Expect In Interview
- Be ready to discuss real-time issues and troubleshooting steps in AKS, Airflow, and Kubeflow.
- Prepare to walk through CI/CD pipelines you've built or maintained.
- Brush up on Azure services and observability practices.
- Review problem-solving and troubleshooting steps in Kubeflow and Airflow.
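For the AKS troubleshooting discussion, it helps to have the basic triage sequence fresh. A sketch assuming a misbehaving Airflow scheduler pod in an `airflow` namespace (pod and namespace names are illustrative, and these commands require access to a live cluster):

```shell
# Illustrative pod-triage sequence on AKS; names are placeholders
kubectl get pods -n airflow                              # spot CrashLoopBackOff / Pending pods
kubectl describe pod airflow-scheduler-0 -n airflow      # events: image pulls, OOMKilled, scheduling
kubectl logs airflow-scheduler-0 -n airflow --previous   # logs from the last crashed container
kubectl top pods -n airflow                              # CPU/memory pressure (needs metrics-server)
kubectl get events -n airflow --sort-by=.lastTimestamp   # recent cluster events, newest last
```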