Role - Data Engineer
Experience - 6 to 11 years only
Locations - Hyderabad, Bangalore, Noida, or Chennai
Notice Period - Immediate joiners or a maximum of 30 days
Interview Process
L1 - Bare Bot Test
L2 - Face-to-Face at client location
Must Have Skills
- 6 to 11 years of overall experience
- Working experience with Python
- Working experience with PySpark
- Working experience with AWS (Redshift, S3, Glue, or Lambda)
- Working experience with SQL
- Working experience with Airflow
Data Engineer - Role Summary
Supports data ingestion, transformations, and Redshift data pipeline development. Focuses on data validation, logging, and error handling.
- Strong experience in PySpark, Python, and SQL (must have)
- Working knowledge of AWS Glue, Redshift, and S3 (must have)
- Familiarity with JSON parsing and flattening (must have; see the first sketch below)
- Understanding of data pipeline error handling and logging (must have)
- Apache Airflow: DAG writing and scheduling (must have; see the second sketch below)
- Version control with Git (must have)
- Exposure to EMR is good to have
- AWS (S3, Glue, Redshift) (must have; Glue optional)
- Python (pandas, json, logging) (must have)
- GitLab and Airflow
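
For context only, here is a minimal, purely illustrative sketch of the kind of PySpark task described above: reading raw JSON from S3, flattening nested fields, validating, logging, and handling errors before writing curated data back to S3. It is not part of the role requirements, and the bucket paths, field names, and job name are hypothetical.

# Illustrative only: a minimal PySpark job that reads raw JSON from S3,
# flattens nested fields, logs progress, and handles errors before
# writing the result back to S3. Bucket names, paths, and field names
# are hypothetical placeholders, not details from this posting.
import logging
import sys

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("orders_flatten_job")


def main() -> None:
    spark = SparkSession.builder.appName("orders-flatten-job").getOrCreate()
    try:
        # Hypothetical input path; a real job would take this from config or Glue job args.
        raw = spark.read.json("s3://example-raw-bucket/orders/2024/")
        log.info("Read %d raw records", raw.count())

        # Flatten nested JSON: pull struct fields up to top-level columns
        # and explode an array of line items into one row per item.
        flat = (
            raw.withColumn("item", F.explode_outer("items"))
               .select(
                   F.col("order_id"),
                   F.col("customer.id").alias("customer_id"),
                   F.col("customer.address.city").alias("customer_city"),
                   F.col("item.sku").alias("sku"),
                   F.col("item.qty").cast("int").alias("qty"),
               )
        )

        # Basic validation: drop rows missing the primary key and log how many were dropped.
        valid = flat.filter(F.col("order_id").isNotNull())
        log.info("Dropped %d rows with null order_id", flat.count() - valid.count())

        # Write flattened output as Parquet, ready for a Redshift COPY or a Glue catalog table.
        valid.write.mode("overwrite").parquet("s3://example-curated-bucket/orders_flat/")
        log.info("Write complete")
    except Exception:
        # Error handling: log the failure and exit non-zero so the scheduler marks the task failed.
        log.exception("orders_flatten_job failed")
        sys.exit(1)
    finally:
        spark.stop()


if __name__ == "__main__":
    main()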
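
Likewise hypothetical, a short Airflow sketch of the DAG writing and scheduling mentioned above, running such a script daily. The DAG id, schedule, and script path are assumptions for illustration only.

# Illustrative only: a minimal Airflow DAG that schedules the flattening job
# above once a day. DAG id, schedule, and script path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="orders_flatten_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    flatten_orders = BashOperator(
        task_id="flatten_orders",
        # Submit the PySpark script; in practice this might instead trigger
        # a Glue job or an EMR step rather than a local spark-submit.
        bash_command="spark-submit /opt/jobs/flatten_orders.py",
    )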
To apply, connect with Abhishek via [Confidential Information] or WhatsApp on 9154908075