Skills Required:
- Excellent understanding of enterprise data warehousing principles and ETL/ELT solutions.
- Flexible and able to adjust to changing priorities in a fast-paced environment.
- Experience working on or with APIs for data delivery/ingestion, e.g., REST APIs, GraphQL, etc.
- Experience working on or with batch-based data delivery/ingestion paradigms, e.g., CSV files, ODBC connections, etc.
- Experience with code management tools (e.g., GitHub) and build/deployment tools (e.g., Travis CI).
- Excellent knowledge of and working experience with data management technologies and frameworks, preferably Python, Airflow, AWS API Gateway, SNS, SQS, and Lambda.
- Good understanding of the Lambda and Kappa architecture patterns.
- Ability to work with large and complex data sets, data lakes, and data warehouses.
- Strong SQL query skills: hands-on experience with AWS Redshift, PostgreSQL, and other database technologies such as MySQL and Microsoft SQL Server, and the ability to write complex queries to extract data from and load data into those databases.
- Proven ability to manage a team of developers and to work in an objective-oriented environment.
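As an illustration of the batch-based ingestion paradigm and SQL skills listed above, the following is a minimal sketch in Python. It uses sqlite3 purely as a stand-in for a warehouse engine such as Redshift or PostgreSQL, and the `orders` table and CSV layout are hypothetical, not from any specific system.

```python
import csv
import io
import sqlite3

def load_orders_csv(conn, csv_text):
    """Batch-load a CSV extract into a table, replacing rows on
    conflicting keys -- a common batch ELT pattern. The 'orders'
    schema here is hypothetical."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)"
    )
    # Parse the CSV extract into typed rows before the bulk insert.
    rows = [
        (int(r["id"]), float(r["amount"]))
        for r in csv.DictReader(io.StringIO(csv_text))
    ]
    conn.executemany(
        "INSERT OR REPLACE INTO orders (id, amount) VALUES (?, ?)", rows
    )
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
n = load_orders_csv(conn, "id,amount\n1,9.99\n2,25.00\n")
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

In a production pipeline the same shape would typically be wrapped in an Airflow task, with the CSV landing in S3 and a warehouse-native bulk command (e.g., Redshift `COPY`) replacing the row-by-row insert.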