ABOUT UNILEVER:
Every individual here can bring their purpose to life through their work. Join us and you'll be surrounded by inspiring leaders and supportive peers. Among them, you'll channel your purpose, bring fresh ideas to the table, and simply be you. As you work to make a real impact on the business and the world, we'll work to help you become a better you.
Role Overview:
We are undertaking an enterprise transformation program focused on simplifying and modernizing our core systems and data architecture. As part of this initiative, we are seeking a mid-level Data Analyst to support our Data Analyst Lead in driving data standardization and integration efforts. In this role, you will help bridge the company's current data landscape (e.g., SAP ECC, SAP BW, existing data warehouses) with a new harmonized data model developed by global process owners. You will leverage your 3-5 years of experience and strong data modeling skills to map legacy data to the new model, develop functional specifications, and ensure that data across key domains (Finance, Supply Chain, Master Data, etc.) is accurately integrated. You will work closely with business process owners, data platform teams, and analytics teams, reporting to the Data Analyst Lead to ensure the new data model meets business needs and is successfully implemented.
This Data Analyst role is critical in ensuring the organization's data is harmonized and ready for advanced analytics as part of the transformation program. The ideal candidate combines solid technical data modeling abilities with an understanding of business data needs in areas like Finance and Supply Chain. By collaborating with the Data Analyst Lead and various stakeholders, you will help build a robust data foundation that supports the company's future-state processes and decision-making. This position offers the opportunity to contribute to one of the organization's most significant transformation initiatives, with a direct impact on how data powers the business moving forward.
Key Responsibilities:
- Data Model Mapping & Harmonization: Map existing data from current systems (SAP ECC, SAP BW, etc.) to the new harmonized enterprise data model. Identify gaps or inconsistencies between old and new data structures, ensuring all key business metrics (KPIs) are accurately represented in the new model. Document data mapping and transformation rules for use by development and analytics teams, and provide feedback to refine the data model based on insights gathered during the mapping process.
- Functional Specification Development: Collaborate with the Data Analyst Lead to create detailed functional specifications for building the new unified data layer. Define how source system fields and tables should be transformed, combined, or re-modeled to fit the harmonized design. Ensure that business rules (e.g., calculations, data derivations) are clearly captured in these specifications, providing a solid blueprint for data engineers and developers. Help maintain a data dictionary or glossary for the new data environment to facilitate a common understanding among stakeholders.
- Data Pipeline Design & Operations: Design, develop, and support data pipelines to extract, transform, and load (ETL/ELT) data into the new harmonized data platform. Utilize Azure Data Factory (ADF) for orchestration (scheduling, workflow management) and Azure Databricks (Spark) for large-scale data transformation and processing. Ensure pipelines are well-structured and efficient - including proper scheduling, error/exception handling, and data validation steps. Monitor pipeline performance and data quality; troubleshoot issues (e.g., pipeline failures, data mismatches) and optimize for speed and reliability. While this is not a formal lead role, be prepared to guide data engineers by reviewing pipeline designs or code, sharing best practices, and coordinating efforts to implement fixes or improvements. These efforts will ensure high-quality data flows seamlessly from source systems to the new data platform.
- Testing & Data Validation: Work with IT and testing teams during development and User Acceptance Testing (UAT) to validate the new data model and pipelines. Design test scenarios and sample data (including defining golden records or reference data sets) to verify that transformed data in the new system matches expected results. Perform data reconciliation between legacy reports (e.g., SAP BW outputs) and the new harmonized data layer to ensure accuracy and completeness. When discrepancies or data quality issues are discovered, analyze root causes (whether source data issues, mapping errors, or pipeline problems) and collaborate with the team to resolve them.
- Go-Live Support: Provide support during deployment to ensure all critical data issues are resolved before go-live, so that business users can trust the new data platform from day one. If required, create relevant dashboards or monitoring solutions to validate data and support user adoption during the go-live phase. Be responsive to any data issues that arise post-launch and coordinate their prompt resolution.
- Data Governance & Quality: Uphold the organization's data governance standards throughout the data transformation. Ensure consistent definitions and usage of data across the new model in line with enterprise data standards. Support the establishment of data quality checks and master data management practices in the new system - for example, ensuring pipelines include validation for key fields and that master data (such as product or customer information) remains consistent and de-duplicated.
- Documentation & Data Lineage: Maintain thorough documentation of data mappings, transformation logic, and data lineage to provide transparency and support ongoing governance. Ensure that all changes and designs are well-documented, enabling future teams to understand how data flows have been constructed and how data is transformed throughout the pipeline. This documentation will be critical for maintenance, auditing, and knowledge transfer.
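To make the mapping and reconciliation duties above concrete, here is a minimal sketch of the kind of logic involved. All field names (SAP-style codes like BUKRS, and the harmonized names) are hypothetical illustrations, not the actual harmonized model:

```python
# Minimal sketch: legacy-to-harmonized field mapping with gap detection,
# plus a simple KPI reconciliation check. All names are hypothetical.

# Hypothetical mapping spec: legacy SAP-style field -> harmonized model field
FIELD_MAP = {
    "BUKRS": "company_code",
    "MATNR": "material_id",
    "NETWR": "net_value",
}

def map_record(legacy_record: dict) -> tuple[dict, list[str]]:
    """Translate one legacy record; return the mapped record plus any
    unmapped fields, which feed the gap analysis."""
    mapped, gaps = {}, []
    for field, value in legacy_record.items():
        if field in FIELD_MAP:
            mapped[FIELD_MAP[field]] = value
        else:
            gaps.append(field)  # flag for follow-up with process owners
    return mapped, gaps

def reconcile(legacy_total: float, new_total: float, tolerance: float = 0.01) -> bool:
    """Check that a KPI total in the new layer matches the legacy report
    within a tolerance, as in legacy-vs-harmonized reconciliation."""
    return abs(legacy_total - new_total) <= tolerance

record, gaps = map_record({"BUKRS": "1000", "MATNR": "M-42", "NETWR": 99.5, "ZZOLD": "x"})
print(record)  # {'company_code': '1000', 'material_id': 'M-42', 'net_value': 99.5}
print(gaps)    # ['ZZOLD']
print(reconcile(99.5, record["net_value"]))  # True
```

In practice this logic would live in documented mapping specifications and Databricks/ADF pipelines rather than a standalone script; the sketch only shows the shape of the work.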
Required Skills & Qualifications:
- Education & Experience: Bachelor's or Master's degree in Computer Science, Information Systems, Data Analytics, or a related field. Approximately 5+ years of experience in data analysis, data modeling, or data management roles, including involvement in large-scale ERP or data transformation projects (e.g., ERP migration, data warehouse implementation). Experience in a large enterprise or FMCG (fast-moving consumer goods) company's data environment is a plus.
- Data Modeling Expertise: Proven ability in data modeling and database schema design. Capable of translating complex business requirements into conceptual and logical data models. Hands-on experience creating entity-relationship diagrams, data mapping documents, or data flow diagrams for enterprise-scale data projects. Familiarity with data modeling tools or notation (e.g., ERwin, UML) is beneficial.
- Data Engineering Tools: Strong hands-on experience with modern data engineering tools and techniques. Proficiency in building data pipelines using platforms such as Azure Data Factory (for ETL orchestration) and Azure Databricks (for data processing with Spark) is essential. Strong SQL skills and experience writing complex data transformation logic are required; experience with programming languages like Python or Scala for data engineering is a plus. Familiarity with big data performance optimization (tuning Spark jobs, handling large data volumes) is important. Comfortable setting up pipeline schedules, monitoring workflows, and troubleshooting failures in a cloud environment. Experience with other cloud data services or ETL tools (e.g., Azure Synapse, Google Cloud data pipelines, or similar) is a bonus.
- Domain Knowledge: Familiarity with key business domains and data in Finance and Supply Chain, as well as a solid understanding of Master Data concepts (e.g., products, customers, vendors). Ability to understand how data is used in financial reporting, supply chain planning, and related business processes. Prior exposure to SAP ECC/BW data structures or similar ERP systems is highly advantageous.
- Data Systems & Platforms: Experience working in data warehousing or big data environments. Knowledge of enterprise data warehouse solutions such as SAP Business Warehouse (BW) or comparable systems - understanding how ERP data is extracted and utilized for reporting and analytics. Exposure to modern cloud-based data platforms and tools (e.g., Microsoft Azure data services, Google Cloud Platform's data tools) for moving and transforming data is desired. Proficiency in SQL and familiarity with business intelligence/analytics tools (e.g., Power BI, Tableau) are also useful.
- Analytical & Problem-Solving Skills: Excellent analytical skills with strong attention to detail. Ability to compare current vs. future data structures, identify discrepancies, and propose solutions to reconcile differences. Comfortable working with large, complex datasets and performing root-cause analysis on data issues. A pragmatic problem-solver who can balance ideal solutions with practical constraints and timelines.
- Collaboration & Communication: Strong communication and interpersonal skills. Proven ability to work effectively in cross-functional teams and serve as a liaison between technical teams (data engineers, IT) and business stakeholders. Capable of translating technical data details into business-friendly language. Should be comfortable participating in meetings or workshops to gather requirements and clearly documenting outcomes. Experience in coordinating with multiple stakeholders and driving consensus is valuable.
- Proactive & Team-Oriented: Self-motivated with the ability to take ownership of assignments and drive them to completion, while also knowing when to seek guidance or escalate issues. Demonstrated ability to work collaboratively in a team environment, supporting colleagues and sharing knowledge. Adaptable and eager to learn new tools or concepts as the project evolves. A positive attitude towards change and continuous improvement is key.
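As an illustration of the SQL transformation and data-quality skills the role calls for, the snippet below uses an in-memory SQLite database to de-duplicate a hypothetical master-data table and run a key-field validation. The schema and values are invented for illustration; real work would run against the enterprise platforms named above:

```python
# Sketch of SQL-based de-duplication and a data-quality check,
# using an in-memory SQLite database. Schema and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_customer (customer_id TEXT, name TEXT, country TEXT);
    INSERT INTO legacy_customer VALUES
        ('C1', 'Acme',   'NL'),
        ('C1', 'Acme',   'NL'),   -- duplicate master-data record
        ('C2', 'Globex', NULL);   -- missing key field
""")

# Transformation: de-duplicate into the harmonized layer
conn.execute("""
    CREATE TABLE harmonized_customer AS
    SELECT DISTINCT customer_id, name, country FROM legacy_customer
""")

# Data-quality check: key fields must be populated
rows = conn.execute("SELECT COUNT(*) FROM harmonized_customer").fetchone()[0]
missing = conn.execute(
    "SELECT COUNT(*) FROM harmonized_customer WHERE country IS NULL"
).fetchone()[0]
print(rows, missing)  # 2 1 -> two distinct customers, one failing the check
```

Checks like this - row counts, distinct-key counts, and null checks on mandatory fields - are the building blocks of the reconciliation and governance work described in the responsibilities.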