Job Title: Data Engineer (Databricks + GCP)
Experience: 5+ years
Location: Remote
Overview:
We are looking for a Data Engineer with strong experience in Databricks and Google Cloud Platform (GCP) to design, build, and optimize scalable data pipelines and Delta Lake environments.
Key Responsibilities:
- Develop and maintain ETL pipelines using PySpark on Databricks (see the sketch after this list)
- Manage and optimize Delta Lake architectures
- Integrate data from a variety of sources into the data platform
- Improve data pipeline performance and reliability
- Write efficient SQL queries for data processing and analytics
- Collaborate with cross-functional teams to deliver high-quality data solutions
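For flavor, here is a minimal sketch of the kind of PySpark pipeline this role involves. All names (the events path, the analytics.raw_events table, the event_id/event_ts fields) are illustrative placeholders, and the Delta write assumes a Databricks runtime or the open-source delta-spark package:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Ingest raw JSON events from cloud storage (placeholder path)
raw = spark.read.json("/mnt/raw/events/")

# Light cleanup: type the timestamp, derive a partition column,
# and drop duplicate events (field names are hypothetical)
cleaned = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Land the result as a partitioned Delta table (placeholder name)
(cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("analytics.raw_events"))

# Periodic compaction and clustering; OPTIMIZE ... ZORDER BY is
# Delta SQL supported on Databricks
spark.sql("OPTIMIZE analytics.raw_events ZORDER BY (event_id)")
```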
Required Skills:
- Strong hands-on experience with PySpark and Databricks
- Experience with GCP services (BigQuery, Cloud Storage, Compute Engine, etc.); a BigQuery read sketch follows this list
- Deep understanding of Delta Lake (ACID transactions, schema enforcement and evolution, time travel, compaction)
- Solid SQL skills
- Experience tuning Spark jobs, SQL queries, and storage layouts for performance
- Good problem-solving and communication skills
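To illustrate the Databricks-plus-GCP combination, a sketch of reading a BigQuery table into Spark through the spark-bigquery connector (bundled with Databricks on GCP and with Dataproc); the project, table, and column names are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bq-read").getOrCreate()

# Read a BigQuery table via the spark-bigquery connector
orders = (spark.read.format("bigquery")
          .option("table", "my-project.analytics.orders")  # placeholder
          .load())

# Run SQL over it in Spark (columns are hypothetical)
orders.createOrReplaceTempView("orders")
spark.sql("""
    SELECT customer_id, SUM(amount) AS total_spend
    FROM orders
    GROUP BY customer_id
""").show()
```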