Exponentia.ai

Data Architect

  • Posted 21 hours ago

Job Description

About Exponentia.ai

Exponentia.ai is a fast-growing AI-first technology services company, partnering with enterprises to shape and accelerate their journey to AI maturity. With a presence across the US, UK, UAE, India, and Singapore, we bring together deep domain knowledge, cloud-scale engineering, and cutting-edge artificial intelligence to help our clients transform into agile, insight-driven organizations.

We are proud partners of global technology leaders such as Databricks, Microsoft, AWS, and Qlik, and have been consistently recognized for innovation, delivery excellence, and trusted advisory services.

Awards & Recognitions

  • Innovation Partner of the Year, Databricks (2024)
  • Digital Impact Award, UK, TMT Sector (2024)
  • Rising Star, APJ Databricks Partner Awards (2023)
  • Qlik's Most Enabled Partner, APAC

With a team of 450+ AI engineers, data scientists, and consultants, we are on a mission to redefine how work is done by combining human intelligence with AI agents to deliver exponential outcomes.

Learn more: www.exponentia.ai

About The Role

We are looking for a highly skilled Data Architect with hands-on experience in modern cloud-based data platforms and strong working knowledge of Databricks. The candidate will architect scalable data ecosystems, design end-to-end data pipelines, and establish data standards to support advanced analytics, BI, and AI initiatives.

Key Responsibilities

Data Architecture & Platform Design

  • Design and implement scalable enterprise data architectures across cloud environments.
  • Develop conceptual, logical, and physical data models for analytical and operational use cases.
  • Define data ingestion, transformation, and integration patterns using Databricks, Delta Lake, and related frameworks.
  • Architect ELT/ETL pipelines leveraging Databricks Workflows, Delta Live Tables, or orchestration tools.
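As a rough sketch of the pipeline pattern this role works with (table names, paths, and columns here are illustrative only, not part of the role), a Delta Live Tables definition in SQL might look like:

```sql
-- Hypothetical bronze-to-silver ingestion in Delta Live Tables.
-- Ingest raw JSON files incrementally with Auto Loader.
CREATE OR REFRESH STREAMING TABLE bronze_orders
AS SELECT * FROM cloud_files('/mnt/raw/orders', 'json');

-- Apply a data quality expectation while promoting to the silver layer.
CREATE OR REFRESH STREAMING TABLE silver_orders (
  CONSTRAINT valid_order_id EXPECT (order_id IS NOT NULL) ON VIOLATION DROP ROW
)
AS SELECT order_id, customer_id, CAST(amount AS DECIMAL(10, 2)) AS amount
FROM STREAM(LIVE.bronze_orders);
```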

Databricks & Lakehouse Responsibilities

  • Develop and optimize data pipelines on Databricks (SQL, Python, PySpark).
  • Implement Lakehouse architecture principles using Delta Lake, Unity Catalog, and Databricks compute clusters.
  • Optimize Spark jobs, cluster configurations, and cost/performance strategies.
  • Work with Databricks features such as Feature Store, MLflow, Delta Sharing, and workspace governance.
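The cost/performance work above typically includes routine Delta table maintenance. A minimal sketch (the table name is hypothetical):

```sql
-- Compact small files and co-locate data for selective queries.
OPTIMIZE main.sales.transactions ZORDER BY (customer_id);

-- Remove files no longer referenced by the table (7-day retention shown).
VACUUM main.sales.transactions RETAIN 168 HOURS;

-- Refresh table statistics to help the query optimizer.
ANALYZE TABLE main.sales.transactions COMPUTE STATISTICS;
```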

Data Governance & Quality

  • Define data quality rules, lineage, metadata standards, and governance frameworks.
  • Collaborate with security teams to ensure compliance with data privacy and security requirements.
  • Implement governance structures using Unity Catalog, RBAC, and data access policies.
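For illustration, access policies of this kind are commonly expressed as Unity Catalog grants (catalog, schema, and group names below are hypothetical):

```sql
-- Grant an analyst group read-only access scoped to one schema.
GRANT USE CATALOG ON CATALOG main TO `data_analysts`;
GRANT USE SCHEMA ON SCHEMA main.sales TO `data_analysts`;
GRANT SELECT ON TABLE main.sales.transactions TO `data_analysts`;
```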

Cross-functional Collaboration

  • Partner with data engineers, analysts, AI/ML teams, and business stakeholders to deliver data-driven solutions.
  • Translate business needs into scalable, secure, and efficient data architectures.
  • Provide architectural guidance and best practices around Databricks and cloud data systems.

Strategy & Innovation

  • Evaluate data technologies and recommend tooling aligned with modernization and scalability goals.
  • Drive cloud migration and transformation initiatives, including legacy system modernization.
  • Contribute to the long-term enterprise data architecture roadmap.

Job ID: 135881831