KPMG India

Assistant Manager

Job Description

Roles & responsibilities

Role Overview

The Senior Associate 2 (SA2) Azure Data Engineer will be part of the GDC Technology Solutions (GTS) team, contributing to the Audit Data & Analytics domain. This role involves developing proficiency in KPMG proprietary Data & Analytics tools and applying audit methodology to deliver impactful solutions.

The ideal candidate should possess strong expertise in data engineering, transformation, and integration using Azure technologies, playing a critical role in enabling audit engagements through data-driven insights. As a leader of a medium-sized team, the SA2 will drive the delivery of impactful, data-driven solutions for audit engagements, leveraging advanced Azure technologies.

As part of this team, the candidate will be responsible for extracting and processing datasets from client ERP systems (such as SAP, Oracle, Microsoft Dynamics and others) and other sources. The goal is to provide actionable insights through data transformation, ETL processes, and dashboarding solutions for audit and internal teams. Additionally, the role includes developing innovative solutions using a diverse set of tools and technologies.

Key Responsibilities

Team Leadership & Collaboration

  • Lead, mentor, and manage a team of 4–6 data engineers, ensuring high performance and professional development.
  • Foster a collaborative and inclusive team environment, promoting knowledge sharing and best practices.
  • Maintain accurate project status for self and team, ensuring timely delivery of milestones.
  • Coach junior team members and facilitate knowledge transfer for engagements of varying complexity.

Technical Delivery

  • Perform and oversee data transformation and integration projects using Azure Databricks, Azure Data Factory, and Python/PySpark.
  • Design and manage scalable data processing pipelines and workflows to support analytics and audit requirements.
  • Apply advanced concepts such as partitioning, optimization, and performance tuning for efficient data processing.
  • Integrate Databricks with ERP or third-party systems via APIs, developing robust business transformation logic.
  • Debug, optimize, and resolve issues in large-scale data workflows, proposing effective solutions with minimal guidance.

Data Handling & Workflow Development

  • Demonstrate advanced knowledge of Azure cloud services, including Azure Databricks, Data Factory, Data Lake Storage, and related technologies.
  • Utilize tools such as Alteryx and PowerBI for data analysis and visualization.
  • Apply understanding of audit processes, financial data structures, and risk assessment routines.
  • Explore and apply Azure AI services to enhance business processes.

Performance Optimization & Debugging

  • Debug, optimize, and tune performance for large-scale data processing workflows.
  • Resolve issues with minimal guidance and propose effective solutions.

Stakeholder Engagement

  • Collaborate with audit engagement teams and client IT teams to extract, transform, and interpret data.
  • Deliver meaningful audit insights through reports and dashboards, ensuring accuracy and attention to detail.
  • Prepare and review supporting documentation for audit engagements.

Domain Expertise (added advantage)

  • Experience with General Ledger/Sub-Ledger analysis and development of risk assessment or substantive routines for Audit/Internal Audit (Big 4 experience preferred).

Additional Skills

  • Working knowledge of KQL (Kusto Query Language) and Azure REST APIs is a plus.
  • Enthusiasm to explore and apply Azure AI services in business processes.


Qualifications

Education Requirements

  • B.Tech/B.E./MCA (Computer Science / Information Technology)

Technical Skills

  • Minimum 6–8 years of experience in data engineering, with deep expertise in Azure technologies.
  • Proficiency in Azure, Databricks notebooks, SQL, and Python/PySpark development.
  • Knowledge of at least one ERP system and its data flows.
  • Strong knowledge of ETL tools and processes.
  • Hands-on experience with Azure Databricks, Azure Data Factory (ADF), and Azure Data Lake Storage (ADLS).
  • Comprehensive knowledge of Azure cloud services.
  • Experience with Databricks notebooks for building transformations and creating tables.
  • Experience in Alteryx and PowerBI for data analysis and visualization.
  • Experience with Microsoft Fabric and Azure AI services is an added advantage.
  • Understanding of audit processes, financial data structures, and risk assessment routines.

Enabling Skills


  • Excellent analytical, problem-solving, and troubleshooting abilities
  • Critical thinking: able to examine numbers, trends, and data and draw new conclusions from findings
  • Attention to detail and a good team player
  • Quick learning ability and adaptability
  • Willingness and capability to deliver within tight timelines
  • Effective communication skills
  • Flexibility with work timings and willingness to work on different projects/technologies
  • Ability to collaborate with business stakeholders to understand data requirements and deliver solutions

Leadership Qualities


  • Demonstrates integrity, accountability, and a commitment to team success.
  • Inspires and motivates team members, fostering a culture of continuous learning and improvement.

  • Acts as a role model for collaboration, inclusion, and professional growth.

#KGS



Job ID: 135673225
