Location Name: Pune Corporate Office - Mantri
Job Purpose
To design, implement, and manage real-time data replication solutions using Oracle GoldenGate. The role involves ensuring high availability, performance, and reliability of data integration processes across heterogeneous systems while supporting business-critical data movement and operational continuity.
Duties And Responsibilities
Key Roles –
- GoldenGate Implementation: Install, configure, and manage Oracle GoldenGate (19c and above) for real-time data replication.
- Database Expertise: Work extensively with Oracle Database architecture and advanced SQL.
- Heterogeneous Replication: Implement and manage replication across multiple source and target systems such as Oracle, SQL Server, and Azure-based data platforms.
- Automation & Monitoring: Develop shell scripts for automation, monitoring, and alerting of GoldenGate processes.
- Performance Optimization: Ensure replication efficiency, low latency, and high throughput while minimizing downtime.
- Cloud Integration: Support integration with cloud platforms such as Azure (Event Hubs, Data Lake Storage, Databricks, etc.).
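As an illustration of the automation-and-monitoring duty above, the sketch below parses the output of GGSCI's `info all` command and surfaces any process that is not RUNNING (e.g. ABENDED or STOPPED). This is a minimal sketch, not a production monitor: the column layout of `info all` varies slightly across GoldenGate versions, and the `ggsci` path used here is an assumed installation location.

```python
import subprocess

def parse_info_all(output: str):
    """Parse GGSCI 'info all' output into (program, status, group) tuples.

    Assumes the classic column layout (Program, Status, Group, ...);
    adjust the parsing for your GoldenGate version if it differs.
    """
    processes = []
    for line in output.splitlines():
        parts = line.split()
        if parts and parts[0] in ("MANAGER", "EXTRACT", "REPLICAT"):
            program = parts[0]
            status = parts[1] if len(parts) > 1 else "UNKNOWN"
            group = parts[2] if len(parts) > 2 else None
            processes.append((program, status, group))
    return processes

def not_running(processes):
    """Return processes whose status is anything other than RUNNING."""
    return [p for p in processes if p[1] != "RUNNING"]

def check_goldengate(gg_home="/u01/app/ogg"):
    """Run 'info all' via GGSCI and return candidates for alerting.

    The gg_home default is illustrative -- point it at your install.
    """
    result = subprocess.run(
        [f"{gg_home}/ggsci"], input="info all\n",
        capture_output=True, text=True,
    )
    return not_running(parse_info_all(result.stdout))
```

In practice a wrapper like this would be scheduled (cron, AWX, etc.) and wired to the alerting channel of choice; the parsing is kept separate from the `subprocess` call so it can be tested against captured GGSCI output.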
Key Responsibilities –
- Design, implement, and maintain Oracle GoldenGate replication solutions for use cases spanning real-time and batch processing.
- Perform installation, configuration, upgrades, and troubleshooting of Oracle GoldenGate environments.
- Monitor replication processes and resolve issues such as lag, abends, and data inconsistencies.
- Implement best practices to minimize downtime and ensure high availability of data pipelines.
- Analyze and recommend infrastructure and database changes to optimize replication performance.
- Develop automation scripts (Shell/Python) for operational efficiency and proactive monitoring.
- Build reusable frameworks and patterns to enable rapid onboarding of new data sources or tenants.
- Own and drive the technical roadmap for data ingestion and replication strategies.
- Collaborate with cross-functional teams including DBAs, cloud teams, and application teams.
- Prepare and maintain operational runbooks, SLA documents, and technical documentation.
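To make the lag-monitoring and SLA responsibilities above concrete, here is a minimal sketch that converts a GoldenGate-style `HH:MM:SS` lag string to seconds and flags an SLA breach. The 300-second threshold is an illustrative assumption, not part of the role description; the real value would come from the SLA document for the pipeline in question.

```python
def lag_to_seconds(lag: str) -> int:
    """Convert a GoldenGate-style HH:MM:SS lag string to seconds."""
    hours, minutes, seconds = (int(part) for part in lag.split(":"))
    return hours * 3600 + minutes * 60 + seconds

def breaches_sla(lag: str, threshold_seconds: int = 300) -> bool:
    """Flag a replication group whose lag exceeds the SLA threshold.

    The 300-second default is illustrative; read the actual limit
    from the SLA documentation for the pipeline being monitored.
    """
    return lag_to_seconds(lag) > threshold_seconds
```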
Required Qualifications And Experience
- Azure Databricks (PySpark, SQL) - Must Have
- Azure Data Factory for ETL and data integration - Must Have
- OOP (object-oriented programming) implementation in C#/.NET
- Azure Cosmos DB for NoSQL
- Azure Event Hubs and Kafka for change feeds and real-time streaming - Good to Have
- Azure Data Explorer as a time-series database, with Kusto Query Language (KQL) as the query language - Good to Have
Soft Skills
- Strong problem-solving and analytical thinking
- Good communication and stakeholder management
- Ability to work independently and as part of a team in a fast-paced environment
- Strong attention to detail and data integrity