Job Title - AWS Big Data Engineer
Skills - AWS, PySpark & Python
Location - Greater Noida, Pune & Hyderabad
Experience - 5 - 10 years
We at Coforge are hiring AWS Big Data Engineers. Key responsibilities:
- Design and implement robust, scalable, and high-performance data pipelines on AWS.
- Develop, manage, and maintain ETL workflows using Apache Airflow.
- Write efficient, optimized Python code for data processing and transformation.
- Use SQL to query, manipulate, and analyze large-scale datasets.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver data solutions.
- Ensure data quality, integrity, security, and governance across data platforms.
- Monitor data pipeline performance, troubleshoot issues, and ensure reliability and uptime.
Mandatory Skills (Must-Have)
- Extensive handson experience with AWS (minimum 4 years), including services such as S3, Lambda, Glue, Redshift, EMR, and related offerings.
- Strong programming skills in Python (PySpark) for data manipulation and automation.
- Advanced SQL expertise, including complex querying and performance optimization.
Preferred Qualifications
- Experience with big data technologies such as Apache Spark and Hadoop.
- Familiarity with CI/CD pipelines and DevOps practices.
- Solid understanding of data warehousing concepts and tools.
Share your resume at [Confidential Information]