
Optimum Solutions Pte Ltd

Principal Streaming Data Architect for our banking client in Chennai

12-14 Years
  • Posted 6 hours ago

Job Description

  • 12+ years of experience delivering data-intensive solutions (data technical architect and engineering).
  • Streaming and event-driven data experience building and using Kafka or queue-based architectures, including ordering, replay, schema evolution, and real-time observability.
  • Strong hands-on expertise in the Apache Kafka ecosystem, including Kafka Connect, Kafka Streams (KStreams), ksqlDB, Schema Registry, and producer/consumer services in Java.
  • Strong understanding of transactional integrity, idempotency, and error handling in financial data flows. Hands-on experience building Java/Spring Boot-based data processing services that produce and consume Kafka events, apply real-time transformations and enrichment, and persist channel-optimized views into MongoDB to support low-latency omnichannel banking use cases.
  • Experience designing replay-safe and resilient consumers for banking workloads.
  • Proven experience architecting and implementing Change Data Capture (CDC) pipelines from heterogeneous sources (RDBMS, NoSQL, SaaS) using tools such as Debezium, Oracle GoldenGate, or native connectors.
  • Solid understanding of MongoDB: schema design and document modeling, indexing, sharding techniques, and performance tuning. Experience with Azure Cosmos DB for MongoDB vCore architecture is a plus.
  • Hands-on experience designing MongoDB as a real-time, read-optimized data serving layer, decoupled from systems of record, for mobile and omnichannel banking use cases.
  • Performance tuning for low-latency mobile workloads.
  • Hands-on expertise in architecting omnichannel and Customer 360 data platforms using MongoDB or similar NoSQL databases. Experience building event-driven materialized views in MongoDB to support mobile and web applications.
  • Strong knowledge of stream and batch processing frameworks such as Spark, Flink, and cloud-native/on-prem data processing services.
  • Expertise in modern data architecture, including data mesh and unified data platforms across on-prem and cloud (Azure/AWS/GCP).
  • Strong knowledge of API-led and event-driven architectures working together (REST + async events). Ability to define contracts, versioning, backward compatibility, and domain events aligned to banking domains.
  • Hands-on experience building Java/Spring Boot services that consume Kafka events and persist real-time, replay-safe updates into MongoDB.
  • Strong understanding of API-driven architectures and how they integrate with streaming and event platforms.
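The "replay-safe" consumption named in the requirements above can be illustrated with a minimal, framework-free sketch: each event carries a unique id, and applying the same event twice leaves the materialized view unchanged. The class, event shape, and in-memory dedupe store below are illustrative assumptions only (a real service would use Kafka consumer APIs and a durable store such as MongoDB), not part of this role's actual codebase.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of an idempotent (replay-safe) event consumer.
class ReplaySafeConsumer {
    // Materialized view keyed by account id (stand-in for a MongoDB collection)
    private final Map<String, Long> balances = new HashMap<>();
    // Ids of events already applied (stand-in for a durable dedupe store)
    private final Set<String> appliedEvents = new HashSet<>();

    /** Apply a credit event; a replayed duplicate is a no-op. */
    public void onCreditEvent(String eventId, String accountId, long amount) {
        if (!appliedEvents.add(eventId)) {
            return; // event id already seen: replay changes nothing
        }
        balances.merge(accountId, amount, Long::sum);
    }

    /** Read the channel-facing view (0 for unknown accounts). */
    public long balanceOf(String accountId) {
        return balances.getOrDefault(accountId, 0L);
    }
}
```

Because the dedupe check and the state update are keyed on the event id, a consumer restarted from an earlier Kafka offset can safely re-deliver events without double-applying financial movements.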

More Info


Job ID: 144625335