Your Job:
- Understand the business case and translate it into a holistic solution involving AWS cloud services, PySpark, EMR, Python, data ingestion, and cloud databases (Redshift/Postgres)
- PL/SQL development for high-volume data sets
- Experience in preparing data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-to-target mapping), and DB query monitoring for tuning and optimization opportunities
- Proven experience with large, complex database projects in environments producing high-volume data
- Demonstrated problem-solving skills; familiarity with various root cause analysis methods; experience in documenting identified problems and determined resolutions
- Makes recommendations regarding enhancements and/or improvements
- Provides appropriate consulting, interfacing, and standards relating to database management, and monitors transaction activity and utilization.
- Performance issue analysis and tuning
- Data Warehouse design and development, including logical and physical schema design.
Other Responsibilities:
Perform all activities in a safe and responsible manner and support all Environmental, Health, Safety & Security requirements and programs
Customer/stakeholder focus: ability to build strong relationships with application teams, cross-functional IT, and global/local IT teams
Required Qualifications:
- Bachelor's or Master's degree in Information Technology, Electrical Engineering, or a similar relevant field
- Proven experience (3 years minimum) with ETL development, design, performance tuning, and optimization
- Very good knowledge of data warehouse architecture approaches and trends, and a strong interest in applying and further developing that knowledge, including an understanding of dimensional modelling and ERD design approaches
- Working experience in Kubernetes and Docker administration is an added advantage
- Good experience with AWS services, big data, PySpark, EMR, Python, and cloud databases (Redshift)
- Proven experience with large, complex database projects in environments producing high-volume data
- Proficiency in SQL and PL/SQL
- Experience in preparing data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-to-target mapping)
- Experience in developing streaming applications, e.g. SAP Data Intelligence, Spark Streaming, Flink, or Storm
- Excellent conceptual abilities paired with very good technical documentation skills, e.g. the ability to understand and document complex data flows as part of business/production processes and infrastructure
- Familiarity with SDLC concepts and processes