Job Description
Essential Skills & Experience:

Experience Level: 10+ years of experience in Data Engineering development, with end-to-end Data Platform implementation using the AWS tech stack.

Mandatory Skills:
- Excellent understanding and design experience of Big Data platforms, Data Lake, Lakehouse, Medallion, and modern data architectures.
- Excellent SQL/PL-SQL scripting skills in columnar databases.
- Excellent data modelling experience.
- Strong skills in Redshift, SQL, PL-SQL, Glue, PySpark, Spark, S3, and shell scripting.
- Strong skills in Airflow, Kinesis, Step Functions, and related ETL/DW data services.
- Ability to develop, enhance, and optimize data pipelines using AWS services and custom ETL tools, and to automate data pipelines.
- Experience deploying code across multiple environments.
- Experience designing job scheduling, optimizing Airflow job orchestration, designing batch/stream monitoring, usage tracking, etc.
- Excellent implementation skills in the AWS tech stack: ETL & ELT, DW/BI, EMR, S3, SQS, Step Functions, Lambda, and Scala.
Good to Have Skills:

- Excellent knowledge of the SDLC and Agile methodology (Scrum/Jira).
- Ability to work closely within an onshore/offshore model.
- Strong team player with good communication and interpersonal skills.
- Ability to learn new ETL tools and implement data pipelines.
- Experience building unit tests, integration tests, system tests, and acceptance tests.
- An understanding of Power BI reports and visualizations.