KPMG India

Assistant Manager - SA2 (AI Hub - GDC)

  • Posted 22 days ago

JOB DESCRIPTION

Roles & responsibilities

  • Design, develop, and implement data pipelines for scalable solutions using technologies such as Databricks or Microsoft Fabric (see the pipeline sketch after this list)
  • Conduct code reviews and recommend best coding practices
  • Provide effort estimates for implementing and developing the proposed solutions
  • Develop and maintain comprehensive documentation for all proposed solutions and models, including detailed technical specifications, testing and evaluation results, and the datasets used to develop the solutions
  • Lead architecture and design efforts for product and application development for relevant use cases, providing guidance and support to team members and clients
  • Implement best practices for data engineering and architectural solution design, development, testing, and documentation
  • Participate in team meetings, brainstorming sessions, and project planning activities
  • Stay up to date with the latest advancements in data engineering to drive innovation and maintain a competitive edge
  • Stay hands-on with the design, development, and validation of deployed systems and models
  • Collaborate with audit professionals to understand business, regulatory, and risk requirements, and key alignment considerations for Audit
  • Drive efforts in the Data Engineering and Architecture practice area
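
For context, a minimal sketch of the kind of batch pipeline described above, assuming a Databricks/PySpark environment; the paths and column names are hypothetical, not taken from the role description:

    # Minimal PySpark ETL sketch (illustrative; paths and column names are hypothetical).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sales_pipeline").getOrCreate()

    # Extract: read raw CSV files from a hypothetical landing zone.
    raw = spark.read.option("header", True).csv("/mnt/landing/sales/")

    # Transform: cast the amount column and aggregate totals per day.
    daily = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .groupBy("order_date")
           .agg(F.sum("amount").alias("total_amount"))
    )

    # Load: write the curated result as a Delta table for downstream consumers.
    daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_sales/")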

Mandatory technical & functional skills

  • Databricks/Fabric/Spark notebooks
  • SQL and NoSQL databases (e.g., Redis)
  • Full-stack development: React, Angular
  • Data Management: design, implement, and manage AI-driven data solutions on the Microsoft Azure cloud platform, ensuring scalability and performance
  • Data Integration: develop and maintain data pipelines for AI applications, ensuring efficient extract, transform, and load (ETL) processes using Azure Data Factory
  • Big Data Processing: use big data technologies such as Azure Databricks and Apache Spark to handle, analyze, and process large datasets for machine learning and AI applications
  • Machine learning frameworks: TensorFlow, PyTorch
  • Backend frameworks: FastAPI, Django (see the FastAPI sketch after this list)
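
As a pointer to the backend stack named above, a minimal FastAPI sketch; the route and payload model are hypothetical, not part of the role description:

    # Minimal FastAPI service sketch (route and model are hypothetical).
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class ScoreRequest(BaseModel):
        text: str

    @app.post("/score")
    def score(req: ScoreRequest) -> dict:
        # Placeholder logic; a real service would invoke a trained model here.
        return {"length": len(req.text)}

Run locally with, e.g., uvicorn app:app --reload (assuming the file is named app.py).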

Preferred Technical & Functional Skills

  • Experience working with frameworks such as LangChain, LlamaIndex, LlamaParse, LlamaCloud, and Semantic Kernel
  • Develop real-time data ingestion and stream-analytics solutions using technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, and the Hadoop platform on any cloud data platform (see the streaming sketch after this list)
  • Certifications: relevant certifications such as Microsoft AI-102, DP-700, or DP-900, or AWS certifications
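
To illustrate the real-time ingestion bullet, a minimal Spark Structured Streaming sketch that reads from Kafka; the broker address, topic, and paths are hypothetical, and the job assumes the Spark-Kafka connector package is available:

    # Minimal Kafka -> Spark Structured Streaming sketch (broker/topic/paths hypothetical).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("events_stream").getOrCreate()

    # Ingest: subscribe to a Kafka topic as a streaming DataFrame.
    events = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")
             .option("subscribe", "events")
             .load()
    )

    # Kafka delivers binary key/value columns; decode the value payload to a string.
    decoded = events.select(F.col("value").cast("string").alias("payload"))

    # Sink: append the decoded stream to a Delta table with checkpointing for recovery.
    query = (
        decoded.writeStream.format("delta")
               .option("checkpointLocation", "/mnt/chk/events/")
               .start("/mnt/streams/events/")
    )
    query.awaitTermination()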

Key behavioral attributes/requirements

  • Strong analytical, problem-solving, and critical-thinking skills
  • Excellent collaboration skills, with the ability to work effectively in a team-oriented environment
  • Excellent written and verbal communication skills, with the ability to present complex technical concepts to non-technical audiences
  • Willingness to learn and work with new technologies

#KGS

QUALIFICATIONS

This role is for you if you have the following:

Educational Qualifications

  • Minimum qualification: B.Tech in Computer Science, M.Tech, or MCA (full-time education)

Work Experience

  • 6 to 8 years of experience designing and developing data-centric applications using various tools and technologies (e.g., databases, reporting, ETL/ELT, NoSQL)
  • 4+ years of experience designing and architecting solutions using Microsoft data technologies such as Azure Data Factory (ADF) and Synapse
  • Relevant data professional certifications (AWS, GCP, or Azure)

Job ID: 132339947