Skill: Data Engineer (PostgreSQL + GKE + Python)
Experience: 5.5 to 10 years
Location: Hyderabad / Pune
Work Mode: Hybrid
Job Summary:
We are seeking a highly skilled Data Engineer with strong expertise in PostgreSQL, Python, and Google Kubernetes Engine (GKE) to build and manage scalable data pipelines and cloud-native data platforms. The ideal candidate will have hands-on experience in data processing, database optimization, and deploying data workloads in Kubernetes environments.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines using Python
- Develop and optimize PostgreSQL databases for large-scale data processing
- Deploy and manage data workloads on GKE (Google Kubernetes Engine)
- Build robust ETL/ELT pipelines for data ingestion, transformation, and loading
- Ensure data quality, integrity, and availability across systems
- Work closely with data analysts, data scientists, and business teams to support their data needs
- Implement data models and optimize query performance
- Automate workflows and deployments using CI/CD practices
- Monitor and troubleshoot data pipeline and infrastructure issues
Required Skills:
- Strong experience in Python for data engineering (Pandas; PySpark is a plus)
- Hands-on experience with PostgreSQL (data modeling, performance tuning, indexing)
- Experience with GKE / Kubernetes for deploying and managing data pipelines
- Solid understanding of ETL/ELT concepts and data warehousing
- Experience with cloud platforms (preferably GCP)
- Familiarity with Docker and containerization
- Strong SQL skills and performance optimization techniques
Good to Have:
- Experience with BigQuery, Airflow, or Cloud Composer
- Knowledge of streaming tools such as Kafka or Google Cloud Pub/Sub
- Exposure to data governance and data quality frameworks
- Experience with monitoring tools such as Prometheus and Grafana