Bajaj Finserv

Senior Database Developer

Posted 8 days ago

Job Description

Location Name: NR Trident Tech Park

Job Purpose

The Senior Database Developer will design, build, and optimize the data backbone of the Engineering CRM platform. This role focuses on architecting PostgreSQL databases, building robust data pipelines using Azure Data Factory and Databricks, and ensuring seamless two-way integration with the HRMS and other systems. The purpose of this role is to ensure high-quality data availability, reliability, and performance across CRM modules, enabling smooth workflows for all internal users.

Duties And Responsibilities

Database Design & Optimization (PostgreSQL)

  • Design and maintain scalable, normalized database schemas for the Engineering CRM.
  • Develop efficient SQL queries, views, stored procedures, and functions.
  • Optimize indexes, partitioning, and query execution plans for high performance.
  • Manage versioning, migrations, and schema changes across environments.
  • Ensure data security, access controls, and role-based permissions at the DB level.

Data Pipeline Development (Azure Data Factory & Databricks)

  • Create and manage ETL/ELT pipelines using Azure Data Factory (ADF).
  • Develop transformation logic, notebooks, and processing workflows using Azure Databricks (PySpark/SQL).
  • Set up data ingestion from APIs, files, databases, and internal systems.
  • Ensure pipelines are resilient, fault-tolerant, and optimized for performance and cost.

System Integrations (Employee Master / HRMS Sync)

  • Build and maintain two-way data sync processes between the CRM and HRMS/employee master systems.
  • Implement delta sync logic to avoid redundant or duplicate data movement.
  • Ensure accurate mapping of employee attributes, hierarchies, organizational structures, etc.
  • Collaborate with application, HR, and infra teams to resolve discrepancies and maintain master data integrity.

Data Quality, Governance & Monitoring

  • Implement validation rules, data quality checks, and cleansing procedures.
  • Set up monitoring, alerts, and logging for pipelines using Azure tools.
  • Conduct root-cause analysis for integration and data quality issues.
  • Maintain data dictionaries, lineage documentation, and metadata repositories.

Collaboration & Delivery

  • Work closely with API developers, PMO/BA, QA, UI/UX, and product teams.
  • Convert feature requirements into DB designs and data architecture.
  • Support testing cycles with test data creation, query validation, and data fixes.
  • Participate in sprint ceremonies and assist with estimates and technical planning.
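The delta sync responsibility above (avoiding redundant data movement between the CRM and HRMS) is commonly built around a diff of record hashes keyed on a stable identifier. A minimal plain-Python sketch, with hypothetical field names such as `employee_id` and `dept`:

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """Stable hash of a record's synced fields, used to detect changes."""
    payload = json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def diff_sync(source: dict[str, dict], target: dict[str, dict]):
    """Compare source (e.g. HRMS) and target (e.g. CRM), keyed by employee id.

    Returns the inserts, updates, and deletes needed to bring the target
    in line with the source -- the core of one delta sync pass.
    """
    inserts = [source[k] for k in source.keys() - target.keys()]
    deletes = [target[k] for k in target.keys() - source.keys()]
    updates = [
        source[k]
        for k in source.keys() & target.keys()
        if record_hash(source[k]) != record_hash(target[k])
    ]
    return inserts, updates, deletes

# Illustrative data: E2 is new in HRMS, E1 changed, E3 left the company.
hrms = {"E1": {"id": "E1", "dept": "Eng"}, "E2": {"id": "E2", "dept": "HR"}}
crm = {"E1": {"id": "E1", "dept": "Ops"}, "E3": {"id": "E3", "dept": "Fin"}}
ins, upd, dele = diff_sync(hrms, crm)
```

Hashing the serialized record keeps the comparison cheap even when many attributes are synced; a production sync would additionally write the three change sets back through the CRM and HRMS APIs or staging tables.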

Required Qualifications And Experience


  • Qualifications

Bachelor's degree in Computer Science, Information Technology, or a related field.

  • Work Experience

Minimum of 3 years of hands-on experience as a Database Developer or Data Engineer.

Work Experience Requirements

  • Practical experience with PostgreSQL in production environments.
  • Strong exposure to Azure Data Factory and Databricks.
  • Experience integrating with HRMS, employee master systems, or similar enterprise data sources (preferred).
  • Experience working in Agile/Scrum setups and CI/CD-based deployments.
  • Knowledge of internal platforms, CRMs, or enterprise workflow tools (an advantage).

Core Database Skills

  • Strong hands-on experience with PostgreSQL (required).
  • Expertise in:
      • Advanced SQL optimization
      • Window functions and CTEs
      • Triggers, functions, and stored procedures
      • Indexing strategies and vacuum/analyze tuning

  • Knowledge of partitioning, performance troubleshooting, and DB health monitoring.
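The window functions and CTEs mentioned above can be shown in a few lines. PostgreSQL is the target engine here, but SQLite (3.25+) supports both features, so this sketch runs self-contained via Python's stdlib `sqlite3`; the `tickets` table and its columns are illustrative, not from the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tickets (id INTEGER, team TEXT, hours REAL);
    INSERT INTO tickets VALUES
        (1, 'data', 4.0), (2, 'data', 6.0), (3, 'api', 2.0);
""")

query = """
WITH team_totals AS (          -- CTE: pre-aggregate hours per team
    SELECT team, SUM(hours) AS total_hours
    FROM tickets
    GROUP BY team
)
SELECT team,
       total_hours,
       RANK() OVER (ORDER BY total_hours DESC) AS busy_rank  -- window function
FROM team_totals
ORDER BY busy_rank;
"""
rows = conn.execute(query).fetchall()
# rows -> [('data', 10.0, 1), ('api', 2.0, 2)]
```

The same query text works in PostgreSQL unchanged; there the optimizer notes in the bullet list (EXPLAIN plans, indexing, vacuum/analyze) determine how efficiently such a query actually executes.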

Azure Data Engineering Skills

  • Hands-on experience with Azure Data Factory (pipelines, datasets, linked services).
  • Good experience with Databricks (PySpark or Spark SQL).
  • Experience with ingestion from REST APIs, file systems, blob storage, and databases.
  • Familiarity with Azure App Service, Key Vault, and Blob Storage (as used in the project).
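In ADF and Databricks, the ingestion-transformation flow above is configured as pipeline activities and notebooks rather than hand-written code; purely as a language-agnostic illustration of the extract → transform → load shape (all function and field names hypothetical), a plain-Python sketch:

```python
import json

def extract(raw_lines):
    """Parse newline-delimited JSON, skipping malformed rows (fault tolerance)."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # a real pipeline would route these to a dead-letter store
    return records

def transform(records):
    """Normalize field values and drop records failing basic validation."""
    out = []
    for r in records:
        if not r.get("employee_id"):
            continue  # validation rule: employee_id is mandatory
        out.append({"employee_id": r["employee_id"],
                    "name": (r.get("name") or "").strip().title()})
    return out

def load(records, sink):
    """Append validated records to the sink (stand-in for a DB write)."""
    sink.extend(records)
    return len(records)

sink = []
raw = ['{"employee_id": "E1", "name": " alice "}', 'not json', '{"name": "bob"}']
loaded = load(transform(extract(raw)), sink)
# loaded -> 1; sink -> [{'employee_id': 'E1', 'name': 'Alice'}]
```

The malformed line and the record without an `employee_id` are dropped rather than failing the run, which is the resilience property the pipeline bullets call for.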

Integration & ETL/ELT Understanding

  • Data mapping, transformations, cleansing, and enrichment logic.
  • Experience in enterprise master-data sync workflows (Employee, HRMS, Org Structure).
  • Familiarity with JSON/XML, flat files, delta loads, and incremental pipeline design.
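Incremental pipeline design of the kind listed above usually hinges on a high-water mark: each run loads only rows modified since the watermark persisted by the previous run. A minimal plain-Python sketch (the `updated_at` column name is an assumption):

```python
from datetime import datetime

def incremental_load(rows: list[dict], watermark: datetime):
    """Return only rows modified after the watermark, plus the new
    watermark to persist for the next pipeline run."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 3, 1)},
    {"id": 3, "updated_at": datetime(2024, 5, 1)},
]
fresh, wm = incremental_load(rows, watermark=datetime(2024, 2, 1))
# fresh contains ids 2 and 3; wm advances to datetime(2024, 5, 1)
```

In ADF the equivalent pattern is a lookup of the stored watermark feeding a filtered copy activity; the key design point is the same either way: the watermark only advances after the filtered rows are safely written.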

General Technical Skills

  • Understanding of backend application behavior (preferably .NET Core APIs).
  • Ability to debug issues across layers (API → ETL → DB → HRMS integration).
  • Version control (Git), CI/CD exposure, SQL profiling tools.

Job ID: 144827301