Key Responsibilities:
- Design, develop, and maintain data pipelines using Python, SQL, and Kedro
- Implement serverless solutions using AWS Lambda and Step Functions
- Develop and manage data workflows in Azure and AWS cloud environments
- Create integrations between data systems and Power Platform (Power Apps, Power Automate)
- Design, develop, and maintain APIs for data exchange and integration
- Implement solutions that extract data from APIs and store it in databases (Dataverse and PostgreSQL); see the Lambda sketch after this list
- Optimize data storage and retrieval processes for improved performance
- Collaborate with cross-functional teams to understand data requirements and provide solutions
- Integrate APIs and extract data from SharePoint
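For a sense of the serverless extraction work described above, here is a minimal sketch of an AWS Lambda handler that pulls records from a REST API and upserts them into PostgreSQL. The endpoint, the environment variable names, and the api_records table are illustrative assumptions rather than part of this role, and the requests/psycopg2 dependencies would need to be packaged with the function (for example via a Lambda layer).

```python
import json
import os

import requests   # packaged with the function or provided via a Lambda layer
import psycopg2   # likewise, e.g. an aws-psycopg2 layer


def lambda_handler(event, context):
    """Pull records from a (hypothetical) REST API and upsert them into PostgreSQL."""
    resp = requests.get(os.environ["SOURCE_API_URL"], timeout=30)
    resp.raise_for_status()
    records = resp.json()

    conn = psycopg2.connect(
        host=os.environ["PG_HOST"],
        dbname=os.environ["PG_DB"],
        user=os.environ["PG_USER"],
        password=os.environ["PG_PASSWORD"],
    )
    # The connection context manager commits the transaction on success.
    with conn, conn.cursor() as cur:
        for rec in records:
            cur.execute(
                "INSERT INTO api_records (id, payload) VALUES (%s, %s) "
                "ON CONFLICT (id) DO UPDATE SET payload = EXCLUDED.payload",
                (rec["id"], json.dumps(rec)),  # assumes each record carries an "id" field
            )
    conn.close()
    return {"statusCode": 200, "body": f"loaded {len(records)} records"}
```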
Required Skills and Experience:
- Expert in an Azure data engineering role, able to build solutions in the Power Platform environment and in AWS and Azure cloud services
- Good knowledge of integration patterns (API- and XML-based) using Azure Data Factory, Logic Apps, Azure Functions, AWS Step Functions, AWS Lambda, etc.
- Strong proficiency in Python (especially the Kedro framework), PySpark, and SQL
- Knowledge of and experience with Scala, Java, and R is a plus
- Experience with the Kedro framework for data engineering pipelines (see the brief pipeline sketch after this list)
- Expertise in AWS services, particularly Lambda and Step Functions
- Proven experience with Power Platform (Power Apps, Power Automate) and Dataverse
- Strong understanding of API development and integration
- Experience in database design and management
- Excellent problem-solving and communication skills
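To illustrate the Kedro experience asked for above, a minimal pipeline sketch follows. The clean_orders function and the raw_orders/orders_clean dataset names are hypothetical; in a real project they would be defined in the Data Catalog and shaped by the actual data requirements.

```python
import pandas as pd
from kedro.pipeline import Pipeline, node, pipeline


def clean_orders(raw_orders: pd.DataFrame) -> pd.DataFrame:
    """Stand-in transformation: drop rows with missing values."""
    return raw_orders.dropna()


def create_pipeline(**kwargs) -> Pipeline:
    """Assemble a one-node pipeline; Kedro resolves dataset names via the Data Catalog."""
    return pipeline(
        [
            node(
                func=clean_orders,
                inputs="raw_orders",     # hypothetical catalog entry for the source data
                outputs="orders_clean",  # hypothetical catalog entry for the cleaned data
                name="clean_orders_node",
            ),
        ]
    )
```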