
FNZ

Senior Data Architect

8-10 Years
  • Posted a day ago

Job Description

Job Title: Data Architect — Analytical Warehouse (FNZ)

About FNZ:

FNZ is a global fintech firm transforming the way financial institutions serve their clients. By combining cutting-edge technology, infrastructure, and investment operations, FNZ enables wealth management firms to deliver personalized investment solutions at scale. Operating across multiple regions and supporting over $1.5 trillion in assets under administration, FNZ partners with leading banks, insurers, and asset managers to create seamless and innovative wealth platforms that empower millions of investors worldwide.

Job Summary:

We are seeking a senior Data Architect to design and own the architecture of the Analytical Warehouse built on Microsoft Fabric. This role is responsible for defining the data models, storage strategies, ingestion patterns, semantic layer, and governance framework that transform the NRT-ODS Gold-layer streaming data into a structured, performant, and governed analytical platform. You will architect the bridge between real-time streaming and historical analytics, serving both operational BI and client-facing reporting workloads.
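The streaming-to-analytical bridge described above can be sketched in miniature. The event fields, layer functions, and the AUM metric below are illustrative assumptions for the Bronze/Silver/Gold (medallion) pattern, not FNZ's actual schema:

```python
# Minimal Bronze -> Silver -> Gold sketch for streamed position events.
# All field names and the AUM metric are hypothetical illustrations.

def to_silver(bronze_event: dict) -> dict:
    """Cleanse and type a raw (Bronze) event into a conformed Silver record."""
    return {
        "account_id": str(bronze_event["acct"]).strip(),
        "as_of_date": bronze_event["ts"][:10],   # keep the date portion only
        "market_value": float(bronze_event["mv"]),
    }

def to_gold(silver_records: list) -> dict:
    """Aggregate Silver records into a Gold metric: AUM per account."""
    aum = {}
    for rec in silver_records:
        aum[rec["account_id"]] = aum.get(rec["account_id"], 0.0) + rec["market_value"]
    return aum

events = [
    {"acct": " A1 ", "ts": "2024-06-01T09:30:00Z", "mv": "1500.0"},
    {"acct": "A1",   "ts": "2024-06-01T09:31:00Z", "mv": "500.0"},
    {"acct": "A2",   "ts": "2024-06-01T09:32:00Z", "mv": "250.0"},
]
silver = [to_silver(e) for e in events]
gold = to_gold(silver)
print(gold)  # {'A1': 2000.0, 'A2': 250.0}
```

In a real Fabric deployment the Silver and Gold steps would run as Spark or pipeline transformations over Delta tables; the sketch only shows the shape of the layering.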

Key Responsibilities:

  • Analytical Warehouse Architecture: Design the end-to-end architecture for the Analytical Warehouse on Microsoft Fabric — ingestion from ODS Gold topics, Bronze/Silver/Gold layering within OneLake, transformation pipelines, semantic layer, and consumption endpoints.
  • Data Modelling: Define dimensional models, star schemas, and wide denormalized tables optimized for analytical query patterns. Design fact and dimension tables for wealth management domains — accounts, portfolios, transactions, positions, fees, NAV, AUM.
  • Ingestion Architecture: Architect the Kafka-to-Fabric ingestion pipeline — Kafka Connect sink configuration, Avro-to-Delta schema mapping, partitioning strategy (date, entity type, client), exactly-once delivery semantics, and error handling.
  • Lakehouse Strategy: Define the OneLake storage architecture, including namespace design, table format strategy (Delta Lake near-term, Apache Iceberg long-term), partition evolution, file compaction policies, and retention management.
  • Semantic Layer Design: Architect the semantic layer that provides business-friendly metrics (AUM, NAV, trade volumes, fee breakdowns) with consistent definitions across dashboards, reports, APIs, and client portals.
  • Data Sharing Architecture: Design the architecture for Fabric Data Sharing — OneLake shortcuts and Delta Sharing protocols that enable clients to consume analytics in their own Fabric tenants with governed, client-scoped access.
  • Data Governance & Contracts: Extend the ODS data contracts framework into the Analytical Warehouse. Define governance policies for the analytical layer, including data classification, access controls (Purview), lineage tracking, and audit trails.
  • Batch Extract Migration: Architect the migration of batch extracts from SQL-driven CSV to Kafka-sourced Parquet/Delta via Fabric pipelines. Design the metadata-driven configuration that preserves CentralHub flexibility.
  • Performance Architecture: Design for query performance — Z-ordering strategies, partition pruning, materialized views, caching layers, and compute resource allocation across Fabric workspaces.
  • Apache Iceberg Roadmap: Plan the long-term migration to Apache Iceberg on OneLake for time-travel queries, partition evolution, and multi-engine access (Fabric, Spark, Trino, Flink). Evaluate Confluent Tableflow or a custom sink for the Kafka-to-Iceberg pipeline.
  • Standards & Governance: Establish naming conventions, modelling standards, documentation requirements, and code review processes for all Analytical Warehouse development. Conduct architecture reviews of Data Engineer deliverables.
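To make the ingestion responsibility concrete, a Kafka Connect sink configuration for landing ODS Gold topics might take roughly the following shape. The connector class, topic names, URLs, and partition layout here are placeholder assumptions for illustration, not a vetted configuration:

```python
# Illustrative Kafka Connect sink configuration and partition-path scheme
# for landing ODS Gold topics into a lakehouse. The connector class and
# topic names are hypothetical placeholders.

sink_config = {
    "name": "ods-gold-to-lakehouse",
    "connector.class": "example.DeltaLakeSinkConnector",  # placeholder, not a real class
    "topics": "ods.gold.positions,ods.gold.transactions",
    "tasks.max": "4",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081",
    "errors.tolerance": "none",  # fail fast; a dead-letter queue is the usual alternative
}

def partition_path(entity: str, client_id: str, as_of_date: str) -> str:
    """Build a date/entity/client partition path per the partitioning strategy above."""
    return f"/lakehouse/bronze/entity={entity}/client={client_id}/date={as_of_date}"

print(partition_path("positions", "C042", "2024-06-01"))
# /lakehouse/bronze/entity=positions/client=C042/date=2024-06-01
```

Partitioning by date, entity type, and client as above is what later enables partition pruning and client-scoped access in the analytical layer.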

Qualifications:

  • Education: Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related technical field.
  • Experience: 8+ years of experience in data architecture or data engineering, with at least 3 years in a data architect role on analytical/warehouse platforms.
  • Microsoft Fabric / Azure: Deep experience with Microsoft Fabric, Azure Synapse Analytics, or equivalent cloud analytical platforms. Strong understanding of OneLake, Fabric lakehouse, and Fabric SQL endpoints.
  • Data Modelling: Expert-level skills in dimensional modelling (Kimball), data vault, and denormalized modelling for analytical workloads. Experience modelling financial services data domains.
  • Delta Lake / Iceberg: Strong understanding of modern table formats — Delta Lake (ACID transactions, time travel, schema evolution) and Apache Iceberg (partition evolution, multi-engine support).
  • SQL Expertise: Advanced SQL skills for analytical queries, performance tuning, and query plan analysis.
  • Streaming-to-Analytical Bridge: Experience architecting data pipelines that bridge real-time streaming platforms (Kafka) with analytical warehouses/lakehouses.
  • Semantic Layers: Experience with semantic layer and data transformation tools for defining governed business metrics.
  • Data Governance: Experience with data governance frameworks, data catalogs (Purview, Atlan), and access control policies in multi-tenant environments.
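The dimensional-modelling skills listed above amount to shaping facts and dimensions so that metrics roll up cleanly by business attributes. A toy sketch, with all table, column, and row values invented for illustration:

```python
# Tiny star-schema sketch: a positions fact rolled up through an account dimension.
# All names and rows are invented for illustration.

dim_account = {
    "A1": {"account_id": "A1", "client": "Acme Bank", "segment": "Retail"},
    "A2": {"account_id": "A2", "client": "Acme Bank", "segment": "HNW"},
}

fact_positions = [
    {"account_id": "A1", "as_of_date": "2024-06-01", "market_value": 2000.0},
    {"account_id": "A2", "as_of_date": "2024-06-01", "market_value": 250.0},
]

def aum_by_segment(facts: list, dim: dict) -> dict:
    """Roll fact rows up to a dimension attribute: AUM per client segment."""
    totals = {}
    for row in facts:
        segment = dim[row["account_id"]]["segment"]
        totals[segment] = totals.get(segment, 0.0) + row["market_value"]
    return totals

print(aum_by_segment(fact_positions, dim_account))  # {'Retail': 2000.0, 'HNW': 250.0}
```

In the warehouse itself this is a fact table joined to conformed dimensions in SQL; the point of the Kimball-style design is that the same dimension serves every fact consistently.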

Preferred Qualifications:

  • Experience working in the Wealth Management or Financial Services industry, with a deep understanding of investment operations data models.
  • Experience with Apache Kafka — consumer architecture, Kafka Connect, Avro schema evolution, and schema registries.
  • Familiarity with SQL-based transformation frameworks for managing transformation layers (models, tests, documentation, CI/CD).
  • Experience with data quality frameworks (Great Expectations, Soda) integrated into analytical pipelines.
  • Experience architecting multi-tenant analytical platforms with client-scoped data isolation.
  • Knowledge of privacy-preserving analytics — differential privacy, confidential compute, or federated analytics patterns.
  • Microsoft Fabric, Azure Data Engineer (DP-203), or Azure Solutions Architect certifications are a plus.
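The data-quality frameworks mentioned above express checks declaratively over table data. The spirit of that approach can be approximated in plain Python; the check names and rows below are illustrative only and are not the actual Great Expectations or Soda API:

```python
# Plain-Python approximation of declarative data-quality checks, in the spirit
# of Great Expectations / Soda. Check names and rows are illustrative only.

def expect_not_null(rows: list, column: str) -> bool:
    """Pass if every row has a non-null value in the given column."""
    return all(row.get(column) is not None for row in rows)

def expect_between(rows: list, column: str, low: float, high: float) -> bool:
    """Pass if every value in the column falls within [low, high]."""
    return all(low <= row[column] <= high for row in rows)

rows = [
    {"account_id": "A1", "market_value": 2000.0},
    {"account_id": "A2", "market_value": 250.0},
]

results = {
    "account_id not null": expect_not_null(rows, "account_id"),
    "market_value in range": expect_between(rows, "market_value", 0.0, 1e12),
}
print(results)  # {'account_id not null': True, 'market_value in range': True}
```

Wired into an ingestion pipeline, failing checks would quarantine the batch before it reaches the Gold layer rather than merely report on it.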

About FNZ

FNZ is committed to opening up wealth so that everyone, everywhere can invest in their future on their terms. We know the foundation to do that already exists in the wealth management industry, but complexity holds firms back.

We created wealth's growth platform to help. We provide a global, end-to-end wealth management platform that integrates modern technology with business and investment operations. All in a regulated financial institution.

We partner with the world's leading financial institutions, with over US$2.4 trillion in assets on platform (AoP).

Together with our clients, we empower nearly 30 million people across all wealth segments to invest in their future.



Job ID: 146021201
