
  • Posted 7 days ago

Job Description

Wolters Kluwer is a global leader in professional information services, providing software and content solutions for legal, business, tax, accounting, finance, audit, risk, compliance, and healthcare professionals. Serving clients in over 180 countries, with operations in more than 40, the company helps professionals make confident decisions and drive results. Headquartered in Alphen aan den Rijn, the Netherlands, it employs around 21,000 people worldwide.

The Global Finance Shared Services (FSS) is a finance function designed to bring together finance and accounting professionals to streamline and enhance processes and tools. Within FSS, the Finance Centre of Excellence (CoE) in Pune, India, provides centralized financial and analytical support across all Wolters Kluwer divisions. The team, comprising finance professionals and data scientists, focuses on finance reporting harmonization, technology enablement, and reporting and analytics enhancements.

In this role, you will be responsible for developing and maintaining data pipelines and ensuring data availability. The primary tools are Snowflake and Informatica.

JOB QUALIFICATIONS

Education: Bachelor's degree in Data Science/Analytics, Engineering in Computer Science, or a related quantitative field.

Experience: At least 5 years of experience in Data Analytics/Engineering.

Technical Skills:

  • Strong SQL (Oracle SQL mandatory).
  • PySpark for ETL transformation and data processing.
  • Experience with Lakehouse architecture & Delta technology.
  • Experience with ingestion, scheduling, and orchestration frameworks (Fabric pipelines, ADF, Airflow, etc.).
  • Knowledge of data modelling and job scheduling on servers.
  • Shell scripting.
  • Power BI understanding is a plus but not mandatory.
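
The shell-scripting and job-scheduling requirements typically mean working with legacy wrapper scripts of the kind below. This is a minimal sketch of that pattern (all names and paths are illustrative, not from any Wolters Kluwer system): run a step, capture its log, and propagate the exit code so a scheduler such as cron can detect failures.

```shell
#!/bin/sh
# Sketch of a legacy-style job wrapper: run a command, capture its output
# in a per-job log file, and return the command's exit code unchanged so
# the calling scheduler can react to failures. Names are hypothetical.
run_job() {
    job_name=$1; shift
    log_file="${LOG_DIR:-/tmp}/${job_name}.log"
    if "$@" >"$log_file" 2>&1; then
        echo "OK ${job_name}"
    else
        rc=$?
        echo "FAIL ${job_name} rc=${rc}" >&2
        return "$rc"
    fi
}

run_job demo_extract true   # a trivially succeeding "job"
```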

Soft Skills:

  • Strong analytical skills and the ability to multi-task in a fast-paced, dynamic environment.
  • Strong written and verbal communication skills, including report writing and data storytelling.
  • Stay updated with industry trends and evolving tools; demonstrate a proactive approach to learning new techniques and technologies.
  • Work effectively across cross-functional teams.
  • Provide support to junior team members and actively contribute to the achievement of team objectives.

ESSENTIAL DUTIES

Legacy Systems Understanding & Analysis:

  • Analyze existing shell/bash scripts, Oracle SQL procedures, and legacy ETL jobs.
  • Reverse-engineer and document current data workflows across ERP and downstream systems.
  • Identify inefficiencies, technical debt, and modernization opportunities.
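
A first pass at reverse-engineering legacy jobs is often a static scan of the scripts for the tables they touch. The sketch below shows that idea in Python under stated assumptions: the regex is illustrative, not a full SQL parser, and the embedded legacy snippet is a hypothetical example, not an actual Wolters Kluwer script.

```python
import re

# Illustrative scan of legacy shell/ETL script text for the Oracle tables
# it references, as a starting point for documenting current data flows.
# This regex catches common clauses only; it is not a SQL parser.
TABLE_REF = re.compile(
    r"\b(?:FROM|JOIN|INTO|UPDATE|TABLE)\s+([A-Za-z_][\w.]*)", re.IGNORECASE
)

def referenced_tables(script_text: str) -> set[str]:
    """Return the set of table names mentioned in a legacy script."""
    return {m.group(1).upper() for m in TABLE_REF.finditer(script_text)}

# Hypothetical legacy snippet of the kind such a scan would run over.
legacy = """
sqlplus -s $CONN <<EOF
INSERT INTO fin_mart.gl_summary
SELECT * FROM erp.gl_balances b JOIN erp.gl_accounts a ON a.id = b.acct_id;
EOF
"""
print(sorted(referenced_tables(legacy)))
```

A table-level inventory like this across all scripts gives a rough lineage map before any detailed workflow documentation begins.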

Data Architecture & Modernization

  • Design target-state data architecture using Fabric Lakehouses, Delta tables, and PySpark-based pipelines.
  • Migrate legacy processes to Fabric notebooks, Dataflows, Pipelines, or other Fabric-native capabilities.
  • Implement scalable, resilient, and cost-efficient data ingestion, transformation, and orchestration patterns.
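
The orchestration pattern behind Fabric pipelines, ADF, and Airflow is dependency-ordered task execution. The toy model below sketches that idea with the standard library only; the task names are illustrative, not an actual pipeline.

```python
from graphlib import TopologicalSorter

# Toy model of a dependency-ordered pipeline: each task lists its upstream
# dependencies, and a topological sort yields a valid execution order.
# Real orchestrators (Fabric pipelines, ADF, Airflow) add scheduling,
# retries, and parallelism on top of this same idea.
def build_run_order(deps: dict[str, set[str]]) -> list[str]:
    """Return a valid execution order for tasks given their upstream deps."""
    return list(TopologicalSorter(deps).static_order())

pipeline = {
    "ingest_erp":       set(),                               # land raw ERP extracts
    "ingest_refdata":   set(),                               # land reference data
    "transform":        {"ingest_erp", "ingest_refdata"},    # PySpark-style transforms
    "publish_datamart": {"transform"},                       # write curated Delta tables
}
print(build_run_order(pipeline))
```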

ERP to Datamart Data Engineering

  • Build end-to-end data flows from ERP system tables (generally accessible via SQL tools) to standardized, curated datamarts.
  • Optimize transformation logic, ensure referential integrity, and enforce data modelling best practices.
  • Collaborate with Finance CoE stakeholders to understand KPIs, business logic, and reporting structures.
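
The ERP-to-datamart flow above can be sketched in miniature. This example uses SQLite as a stand-in for a SQL-accessible ERP source; the schema and table names are hypothetical, and the inner join illustrates one simple way referential integrity is enforced in the curated layer.

```python
import sqlite3

# Stand-in ERP source: an in-memory SQLite database with two toy tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE erp_invoices (id INTEGER PRIMARY KEY, cust_id INTEGER, amount REAL);
    CREATE TABLE erp_customers (id INTEGER PRIMARY KEY, region TEXT);
    INSERT INTO erp_customers VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO erp_invoices VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Curated datamart table: revenue by region. The inner join drops any
# invoice that does not resolve to a known customer, so only referentially
# intact rows reach the mart.
con.execute("""
    CREATE TABLE mart_revenue_by_region AS
    SELECT c.region, ROUND(SUM(i.amount), 2) AS revenue
    FROM erp_invoices i
    JOIN erp_customers c ON c.id = i.cust_id
    GROUP BY c.region
""")
rows = dict(con.execute("SELECT region, revenue FROM mart_revenue_by_region"))
print(rows)
```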

Data Quality, Governance & Performance

  • Implement data validation, reconciliation, and monitoring frameworks.
  • Optimize SQL/PySpark performance for large datasets.
  • Ensure adherence to enterprise data governance and security standards.
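
Validation and reconciliation frameworks usually start from two basic checks: row counts and control totals between source and target. The sketch below shows that minimal core in plain Python; the record layout and field names are illustrative assumptions.

```python
from math import isclose

# Minimal reconciliation sketch: compare row counts and a control total
# between a source extract and its loaded target. Field names and the
# sample data are illustrative only.
def reconcile(source_rows, target_rows, amount_key="amount"):
    """Return a dict of checks; all True means source and target reconcile."""
    src_total = sum(r[amount_key] for r in source_rows)
    tgt_total = sum(r[amount_key] for r in target_rows)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "control_total_match": isclose(src_total, tgt_total, rel_tol=1e-9),
    }

src = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 50.5}]
tgt = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 50.5}]
print(reconcile(src, tgt))  # both checks pass for identical data
```

A production framework would layer schema checks, null/duplicate rules, and alerting on top, but the count-and-total pair above is the usual foundation.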

Collaboration & Delivery

  • Work closely with Finance teams, Data Architects, BI developers, and business stakeholders.
  • Provide technical guidance on modernization approaches and best practices.
  • Document new architecture, pipelines, and operational procedures.

OTHER DUTIES

Enhance Reporting & Analytics Standards:

  • Collaborate with analytics and BI teams to establish dashboard design standards, improve user experience, and ensure consistent KPI definitions across the organization.

Promote a Data-Driven Culture:

  • Advocate for a data-driven culture, promoting the value of analytics in strategic and operational decision-making.
  • Lead or support analytical training sessions or workshops for non-technical stakeholders to improve data literacy across the organization.

Cross-Functional Collaboration:

  • Partner with business, product, and operational teams to provide expert data engineering support, enabling effective data-driven solutions and innovation across departments.

About Company

Wolters Kluwer N.V. (Euronext Amsterdam: WKL) is a Dutch information services company. The company is headquartered in Alphen aan den Rijn, Netherlands (global) and Philadelphia, United States (corporate). Wolters Kluwer in its current form was founded in 1987 through a merger between Kluwer Publishers and Wolters Samsom. The company serves legal, business, tax, accounting, finance, audit, risk, compliance, and healthcare markets. It operates in over 150 countries.

Job ID: 144940885
