Job Title: AWS Data Engineer (Databricks & Snowflake)
Experience: 7+ Years
Location: South India / Pune / Mumbai (Hybrid/Remote as per business needs)
Employment Type: Full-time
Job Summary
We are seeking an experienced AWS Data Engineer with strong expertise in Databricks and Snowflake to design, build, and optimize cloud-based data platforms. The role involves working with large, complex datasets to enable analytics, reporting, and business intelligence initiatives while ensuring performance, scalability, and data quality.
Key Responsibilities
- Design, develop, and maintain end-to-end data pipelines on AWS.
- Build and optimize data processing workflows using Databricks (PySpark, Spark SQL).
- Design, implement, and manage Snowflake data warehouse solutions.
- Ingest data from multiple sources including RDBMS, APIs, flat files, and streaming platforms.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Implement data quality checks, validation, monitoring, and error-handling mechanisms.
- Collaborate with analytics, BI, and business teams to enable reporting and actionable insights.
- Ensure adherence to data security, governance, and compliance best practices.
- Support production deployments and troubleshoot data pipeline issues as needed.
Required Skills & Qualifications
- 7+ years of hands-on experience in Data Engineering.
- Strong experience with AWS services including S3, Glue, Lambda, EC2, and Redshift.
- Extensive hands-on experience with Databricks.
- Strong expertise in Snowflake data warehousing concepts and implementation.
- Proficiency in Python and SQL.
- Experience with ETL/ELT frameworks and data integration tools.
- Solid understanding of data modeling concepts, preferably dimensional modeling.