Roles & Responsibilities:
- Provides expert-level development, system analysis, design, and implementation of applications using AWS services, specifically Python for Lambda (see the illustrative handler sketch after this list).
- Translates technical specifications and/or design models into code for new or enhancement projects (for internal or external clients).
- Develops code that reuses objects, is well-structured, includes sufficient comments, and is easy to maintain.
- Provides follow-up production support when needed; submits change control requests and documentation.
- Participates in design, code, and test inspections throughout the life cycle to identify issues and ensure methodology compliance.
- Participates in systems analysis activities, including system requirements analysis and definition (e.g., prototyping).
- Participates in other meetings, such as those for use case creation and analysis.
- Performs unit testing and writes appropriate unit test plans to ensure requirements are satisfied.
- Assists in integration, systems acceptance, and other related testing as needed.
- Ensures developed code is optimized to meet client performance specifications for page rendering time by completing page performance tests.
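Illustrative Python Lambda handler (a minimal sketch only; the S3 event shape is the standard put-notification format, while the bucket, keys, and return payload are assumptions, not part of the role):

    import json
    import logging

    import boto3  # AWS SDK for Python, available in the Lambda runtime

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # Minimal example: for each S3 put-notification record, look up the
        # object's size and log it.
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            head = s3.head_object(Bucket=bucket, Key=key)
            logger.info("Received %s/%s (%d bytes)", bucket, key, head["ContentLength"])
        return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}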
Technical Skills Required:
- Experience building large-scale batch and data pipelines with data processing frameworks on the AWS cloud platform using PySpark (on EMR) and Glue ETL.
- Deep experience developing data processing and data manipulation tasks using PySpark, such as reading data from external sources, merging datasets, performing data enrichment, and loading into target data destinations (see the pipeline sketch after this list).
- Experience deploying and operationalizing code using the CI/CD tools Bitbucket and Bamboo.
- Strong AWS cloud computing experience.
- Extensive experience with Lambda, S3, EMR, and Redshift.
- At least 8 years of experience working with data warehouse/database technologies.
- Any AWS certification will be an added advantage.
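Illustrative PySpark task of the kind described above (a minimal sketch only; the S3 paths, column names, and file formats are assumptions for illustration):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-enrichment-job").getOrCreate()

    # Read raw data from an external source (here: CSV files landed in S3).
    orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

    # Read a reference dataset used for enrichment.
    customers = spark.read.parquet("s3://example-bucket/reference/customers/")

    # Merge (join) the datasets and enrich with derived columns.
    enriched = (
        orders.join(customers, on="customer_id", how="left")
              .withColumn("order_date", F.to_date("order_ts"))
              .withColumn("is_priority", F.col("segment") == "premium")
    )

    # Load into the target destination as partitioned Parquet.
    (enriched.write
             .mode("overwrite")
             .partitionBy("order_date")
             .parquet("s3://example-bucket/curated/orders_enriched/"))

    spark.stop()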