
Role: AWS Data Engineer
Required Technical Skill Set: AWS Redshift, Glue, PySpark
Desired Experience Range: 4-10 Years
Location of Requirement: Chennai/Pune
Desired Competencies (Technical/Behavioral Competency)
Must-Have:
Strong hands-on experience in Python programming and PySpark
Experience using AWS services (RedShift, Glue, EMR, S3 & Lambda)
Experience working with Apache Spark and Hadoop ecosystem.
Experience in writing and optimizing SQL for data manipulations
Good exposure to scheduling tools; Airflow is preferred.
Data warehouse experience with AWS Redshift or Hive
Experience in implementing security measures for data protection.
Expertise in building and testing complex data pipelines for ETL processes (batch and near real time)
Readable documentation of all components being developed
Knowledge of Database technologies for OLTP and OLAP workloads
Good-to-Have:
Good understanding of data warehouses and data lakes
Familiarity with ETL tools like Netezza or Informatica
Experience working with NoSQL databases like DynamoDB or MongoDB
Exposure to AWS services such as Step Functions and Athena
Data modelling exposure
Familiarity with the Investment Banking domain
Job ID: 134678837