
Ujjivan Small Finance Bank

Lead-Data Engineer_VP-I

  • Posted a day ago

Job Description


The Lead-Data Engineer will be responsible for architecting, building, and governing enterprise-scale data pipelines and platforms for Ujjivan Small Finance Bank. The role ensures secure, high-quality, reliable, and timely data availability to support analytics, regulatory reporting, risk management, and AI/ML initiatives.

This role provides technical and people leadership, defines data engineering standards, and acts as a key interface between business, analytics, governance, and technology teams.

KEY RESPONSIBILITIES OF THE ROLE

  • Design and own end-to-end data pipeline architecture across batch and near real-time processing aligned to enterprise strategy.
  • Define and govern bronze, silver, and gold data layer architecture for enterprise consumption.
  • Enable analytics, ML, and AI use cases by delivering model-ready and feature-ready datasets that drive business outcomes.
  • Optimize data pipeline performance and cost efficiency.
  • Establish CI/CD pipelines for data engineering, including version control, testing, and controlled deployments.
  • Contribute to planning, budgeting, and prioritization of data engineering initiatives aligned to business goals.
  • Collaborate with business, analytics, and risk teams to translate requirements into scalable data solutions.
  • Lead ingestion of data from Core Banking, LOS, LMS, Collections, CRM, Payments, Finance, and external data sources to support internal and external consumers.
  • Enable timely, reliable, and high-quality data availability for stakeholders across the organization.
  • Partner with Data Quality & Governance teams to operationalize Critical Data Elements (CDEs), lineage, and metadata for stakeholder trust and usability.
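The bronze/silver/gold layering named above can be sketched in plain Python. This is a minimal illustration only: the field names (`txn_id`, `account_id`, `txn_amount`) and cleansing rules are assumptions, not the bank's actual schema or stack.

```python
from collections import defaultdict

def to_silver(bronze_rows):
    """Clean raw (bronze) records: drop duplicate transactions, cast types."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row["txn_id"]
        if key in seen:
            continue  # deduplicate on the transaction key
        seen.add(key)
        silver.append({
            "txn_id": key,
            "account_id": row["account_id"],
            "txn_amount": float(row["txn_amount"]),  # enforce numeric type
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned records into a consumption-ready (gold) view."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["account_id"]] += row["txn_amount"]
    return dict(totals)

bronze = [
    {"txn_id": "t1", "account_id": "A", "txn_amount": "100.0"},
    {"txn_id": "t1", "account_id": "A", "txn_amount": "100.0"},  # duplicate
    {"txn_id": "t2", "account_id": "A", "txn_amount": "50.0"},
    {"txn_id": "t3", "account_id": "B", "txn_amount": "25.0"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'A': 150.0, 'B': 25.0}
```

In a real platform these layers would live in a distributed engine (e.g. Spark) rather than in-memory lists; the sketch only shows the layering contract: bronze is raw, silver is cleaned and typed, gold is aggregated for consumers.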

Internal Process

  • Ensure pipeline scalability, fault tolerance, restart ability, and SLA adherence.
  • Implement workflow orchestration, dependency management, backfills, and automated retries.
  • Embed automated data quality checks, reconciliation controls, and anomaly detection.
  • Ensure secure data handling, including masking, encryption, and role-based access control.
  • Ensure compliance with regulatory, audit, and information security requirements.
  • Comply with internal SLAs, policies, and standard operating procedures.
  • Drive process management and continuous process excellence across data engineering workflows.
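The automated-retry behavior listed above can be sketched as follows. This is an illustrative snippet, not a production implementation: attempt counts and delays are assumptions, and in practice an orchestration framework's scheduler would own this logic.

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Run `task`, retrying failures with exponentially growing delay."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: surface failure to the scheduler
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

calls = {"n": 0}

def flaky_load():
    """Simulated extract that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source outage")
    return "loaded"

print(run_with_retries(flaky_load))  # prints "loaded" on the third attempt
```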

MINIMUM REQUIREMENTS OF KNOWLEDGE & SKILLS

Educational

Qualifications

  • Bachelor's or Master's degree in Engineering, Computer Science, or a related field

Experience Range (Years and Core Experience Type)

  • 12-15 years of experience in data engineering or large-scale data platform development.
  • Proven experience in banking or financial services data environments.
  • Demonstrated experience leading teams and enterprise data programs.

Certifications

  • Not mandatory; relevant certifications are good to have

Functional Skills

  • Advanced SQL and strong programming skills in Python/Scala and PySpark.
  • Deep understanding of cloud architecture and DevOps.
  • Strong experience with ETL/ELT frameworks and distributed data processing.
  • Hands-on experience with data orchestration and scheduling frameworks.
  • Deep understanding of data warehousing, data lakes, and layered data architectures.
  • Expertise in data quality, reconciliation, metadata management, and data lineage.
  • Strong knowledge of CI/CD, version control (Git), and automated testing for data pipelines.
  • Experience with data security, masking, encryption, and role-based access control.
  • Exposure to streaming or near real-time data processing is desirable.
  • Understanding of ML/AI data requirements and feature engineering pipelines.
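As an illustration of the "automated testing for data pipelines" and data-quality skills listed above, a minimal batch check might look like the following. Column names and thresholds here are assumptions chosen for the example.

```python
def check_batch(rows, key="txn_id", max_null_rate=0.0):
    """Return a list of failed data-quality checks for a batch of records."""
    failures = []
    keys = [r.get(key) for r in rows]
    null_rate = keys.count(None) / len(rows) if rows else 0.0
    if null_rate > max_null_rate:
        failures.append(f"null rate {null_rate:.2%} on '{key}' exceeds threshold")
    if len(set(keys)) != len(keys):
        failures.append(f"duplicate values found in key column '{key}'")
    return failures

good = [{"txn_id": "t1"}, {"txn_id": "t2"}]
bad = [{"txn_id": "t1"}, {"txn_id": "t1"}, {"txn_id": None}]
print(check_batch(good))  # [] -- batch passes
print(check_batch(bad))   # two failures: nulls and duplicates in the key
```

Checks of this kind would typically run inside the pipeline's CI suite and as post-load gates, failing the run (or quarantining the batch) rather than letting bad data reach downstream consumers.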

More Info


Job ID: 142134783