
Role: Databricks Developer
Location: Pune / Gurgaon
Experience: 4–8 years
Employment Type: Full-time
We're hiring a Databricks Developer to build scalable data pipelines and analytics solutions on the Databricks Lakehouse platform.
Roles & Responsibilities:
1. Build data pipelines: Design and develop ETL/ELT pipelines using PySpark, SQL, and Databricks workflows for structured and semi-structured data.
2. Optimize performance: Tune Spark jobs, manage partitioning, caching, and cluster configurations to reduce cost and runtime.
3. Data modeling & storage: Implement Delta Lake tables, manage schema evolution, and ensure data quality with Unity Catalog.
4. Collaborate & deliver: Work with data analysts, scientists, and engineering teams to translate requirements into production-ready solutions.
Job ID: 147196291
Skills:
Git, GCP, PySpark, Databricks, Azure, SQL, AWS, Delta Lake, Structured Streaming