Job Description
About KPMG in India
KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
The person will work on a variety of projects in a highly collaborative, fast-paced environment and will be responsible for software development activities at KPMG India. As part of the development team, he/she will work on the full development life cycle, writing code and performing unit testing. He/she will work closely with Technical Architects, Business Analysts, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices comply with KPMG's best-practice policies and procedures. This role requires quick ramp-up on new technologies whenever required.
Responsibilities
Role: GCP Data Engineer
Location: Hyderabad
Experience: 15 to 20 years
- Architect, design, and implement large-scale data pipelines and ETL/ELT workflows using GCP services such as BigQuery, Cloud Composer (Airflow), and Dataflow.
- Lead cloud-native data platform modernization initiatives and define best practices for data ingestion, transformation, and governance.
- Build and optimize batch and streaming data solutions using Spark, Kafka, and Dataflow pipelines.
- Develop reusable frameworks, data models, and transformation layers using DBT or equivalent tools.
- Ensure data quality, schema design, lineage tracking, and performance optimization.
- Collaborate with cross-functional teams including Data Architects, Analysts, Product, and Engineering leadership.
- Mentor junior engineers and provide technical guidance to the broader data engineering team.
- Lead reviews on architecture, performance tuning, data modeling, and design decisions.
- Implement security best practices across data pipelines, including IAM, encryption, and compliance controls.
- Work with DevOps teams to establish CI/CD workflows and infrastructure automation.
Mandatory Technical Skills
- BigQuery (query optimization, partitioning, clustering, data modeling)
- Cloud Composer (Airflow) for orchestration
- Dataflow for batch and streaming pipelines
- Strong exposure to GCS, Cloud Functions, Pub/Sub, IAM
- Python (data processing, automation, ETL frameworks)
- Java (pipeline logic, distributed systems)
- SQL (advanced analytical SQL, query tuning)
- Shell scripting for automation and workflow integration
- Spark (PySpark or Java/Scala)
- DBT (Data Build Tool) or similar transformation tools
- Snowflake (data modeling, performance tuning)
- Kafka (streaming ingestion, consumer/producer design)
Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Equal Opportunity Employer
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity, and we request that you submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary, and refusal to submit such information will not be prejudicial to you.