Senior AWS Data Engineer
Location: Pune, India (Hybrid, 3 days onsite)
Experience: 6-12 years
Employment Type: Full-time
Salary Range: INR 22,00,000 - 28,00,000 per annum
Status: Active
Role Overview
We are seeking a Senior AWS Data Engineer to design, develop, and optimize scalable big data solutions. This role involves close collaboration with project leads and stakeholders to translate complex business requirements into robust, end-to-end technical solutions. You will play a key role throughout the project lifecycle, from architecture design to production deployment, while mentoring junior engineers and ensuring the delivery of high-quality solutions.
Key Responsibilities
- Design and develop scalable, high-performance big data pipelines
- Collaborate with stakeholders to gather, analyze, and refine technical requirements
- Translate business requirements into end-to-end data engineering solutions
- Lead projects from design through development and production deployment
- Optimize Spark jobs and SQL queries for performance and scalability
- Implement and maintain CI/CD pipelines for data engineering workflows
- Mentor and guide junior engineers, ensuring best practices and quality delivery
Technical Skills
- Strong expertise in Python, PySpark, and Spark architecture, including performance tuning
- Advanced proficiency in SQL, with a strong focus on query optimization
- Extensive experience with AWS data engineering services:
  - AWS Glue
  - AWS Lambda
  - Amazon S3
  - API Gateway
- Good working knowledge of Unix/Linux environments
- Hands-on experience with CloudFormation and CI/CD pipelines (preferred)
Required Experience (Must-Have)
- Python / PySpark: 6+ years
- AWS Data Services (Glue, Lambda, API Gateway, S3): 6+ years
- SQL Optimization and Performance Tuning: 6+ years
- Spark Architecture and Distributed Processing: 6+ years
- Unix/Linux Environments: 3+ years
Nice to Have
- Experience with Infrastructure as Code (CloudFormation)
- Exposure to CI/CD tools and automated deployment pipelines
Soft Skills
- Strong analytical and problem-solving abilities
- Excellent communication and collaboration skills
- Ability to work in a fast-paced, dynamic environment
- Proven capability to deliver results within tight timelines
Interview Process
- 2 Technical Rounds
- 1 Stakeholder/Client Interaction Round
Skills: PySpark, Spark Architecture, Linux, SQL, API Gateway, Amazon S3, AWS Glue, AWS Lambda, Python