Key Responsibilities:
- Data Pipeline Design: Responsible for designing and developing ETL data pipelines that organise large volumes of data, using data warehousing technologies to ensure the data warehouse is efficient, scalable, and secure.
- Issue Management: Responsible for ensuring the data warehouse runs smoothly. Monitor system performance, diagnose and troubleshoot issues, and make the changes needed to optimize performance.
- Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
- Data Integration and Processing: Responsible for processing, cleaning, and integrating large data sets from various sources to ensure that the data is accurate, complete, and consistent.
- Data Modelling: Responsible for designing and implementing data modelling solutions to ensure that the organization's data is properly structured and organized for analysis.
Key Qualifications & Skills:
- Education Qualification: B.E./B.Tech. in Computer Science, Information Technology, or an equivalent domain, with 10 to 12 years of experience and at least 5 years of relevant work experience in data warehousing/mining/BI/MIS.
- Experience in Data Warehousing: Knowledge of ETL and data technologies such as OLTP and OLAP (Oracle/MSSQL). Experience in data modelling, data analysis, and visualization (analytical tools such as Power BI, SAS, QlikView, Tableau, etc.). Exposure to Azure cloud data platform services such as COSMOS, Azure Data Lake, Azure Synapse, and Azure Data Factory is good to have.
- Certification: Azure certifications such as DP-900, PL-300, or DP-203, or any other data platform/data analyst certifications.