
INFINITO

Data Architect

  • Posted 21 hours ago
  • Be among the first 10 applicants

Job Description

About the Role

We are hiring a Data Architect — not a senior data engineer. This role is about designing data platforms and solutions, not building pipelines day-to-day. You will spend a significant part of your time in front of clients: understanding their business, framing the problem, shaping the target architecture, and guiding the delivery team that builds it.

Our current delivery stack is Microsoft Fabric, and you should be fluent enough in it to make credible architecture and design decisions. But the role is stack-agnostic at its core. We are looking for architects who have designed data solutions across multiple platforms over their career, and who can evaluate trade-offs between tools rather than defaulting to one. Fabric is what we deliver on today; tomorrow's engagement may demand a different choice, and we want architects who think that way.

The architecture craft we are hiring for is a clear progression from business pain to conceptual model to usable analytics: starting with a precise problem statement, resting on a well-reasoned data model with clear transformation logic and built-in data quality, and ending with analytics-ready outputs that decision-makers can actually use.

To avoid confusion with senior engineering roles, here is how we draw the line:

https://drive.google.com/file/d/1tU7zmekDXN71j3qlpmgYZXRwW5c1esYF/view?usp=sharing

A candidate whose CV is primarily a list of pipelines built on a single stack, however senior, is likely a strong data engineer, not the architect we are hiring.

Key Responsibilities

Client Engagement & Solution Shaping

  • Engage directly with client business and technology stakeholders to understand their domain, operating model, pain points, and strategic goals.
  • Translate ambiguous business problems into clear architecture requirements, success metrics, and phased delivery plans.
  • Run discovery workshops, present architecture proposals, and defend design choices in front of both technical and non-technical audiences.
  • Act as the trusted data advisor across the engagement lifecycle — pre-sales, solutioning, delivery oversight, and post-go-live optimisation.

Architecture & Data Modelling

  • Own the target-state architecture: conceptual, logical, and physical designs for data platforms, including integration patterns and consumption layers.
  • Design data models appropriate to the problem — dimensional (Kimball), Data Vault, or domain-driven — and justify the choice.
  • Define reference architectures and architecture decision records (ADRs); establish patterns that multiple delivery teams can reuse.
  • Evaluate tooling and platform trade-offs; recommend the right fit for the client's maturity, scale, and budget rather than defaulting to a single stack.

Delivery on Microsoft Fabric

  • Shape solutions on Microsoft Fabric — OneLake, Lakehouses, Warehouses, Data Factory, Dataflows Gen2, Spark notebooks, Real-Time Intelligence, and Power BI semantic models — at a design and oversight level.
  • Define the medallion architecture (Bronze / Silver / Gold), data contracts, and Delta Lake / partitioning strategies for each engagement.
  • Guide the delivery team on capacity sizing (F-SKUs), workload isolation, and cost / performance optimisation. You will review and direct, not write every pipeline.

Governance, Security & Quality

  • Architect secure, governed platforms using Microsoft Purview, workspace roles, row-level and object-level security, sensitivity labels, and lineage.
  • Define data quality frameworks, testing strategies, and observability approaches that scale across domains.
  • Ensure compliance with relevant standards (GDPR, HIPAA, DPDP, or industry-specific) as part of every design.

Capability Building

  • Mentor data engineers, analysts, and junior architects; raise the design quality of the broader practice.
  • Contribute to internal accelerators, reference architectures, and practice-level IP.
  • Support pre-sales — RFP responses, solution estimates, and PoCs — as part of the Capability function.

Required Qualifications

  • Up to 14 years of total experience, with the last several years spent in an architecture or lead-solutioning capacity — not purely in hands-on engineering.
  • Demonstrable track record of designing end-to-end data solutions across more than one platform over your career (for example, combinations of Azure Synapse, Databricks, Snowflake, AWS Redshift, GCP BigQuery, Teradata, or on-prem warehouses).
  • Working proficiency in Microsoft Fabric at an architecture and design level. Strong Azure data stack experience (Synapse, ADF, Databricks, Power BI) with active Fabric adoption is acceptable.
  • Strong data modelling credentials — able to defend dimensional, Data Vault, and normalised designs and pick the right one for the context.
  • Fluent in SQL and comfortable reading PySpark / Python; you will review and challenge code, even if you are not the primary author on most engagements.
  • Strong grounding in data governance, security, lineage, and compliance.
  • Excellent client-facing communication — able to lead workshops, present to CXOs, and produce crisp architecture documents in English.
  • Willingness to work on-site from our Gurgaon office and travel to client locations as required.

Good to Have

  • Microsoft certifications: DP-600, DP-700, or broader architecture certifications (Azure Solutions Architect, TOGAF, DAMA-CDMP).
  • Experience with dbt, Delta Lake internals, and open table formats (Iceberg, Hudi).
  • Domain depth in one of: healthcare / life sciences, BFSI, retail & CPG, or manufacturing.
  • Exposure to open-source or community-driven data models and standards relevant to your domain (for example, OMOP or FHIR in healthcare).
  • Pre-sales experience — responding to RFPs, building PoCs, and estimating data programmes.


Job ID: 147184949
