
Calpar Global

Data Architect (Cloud & Databricks)

  • Posted 2 days ago

Job Description

We are seeking a highly experienced Data Architect to design and lead modern, scalable, and secure cloud data platforms that power analytics, reporting, and AI-driven decision-making for enterprise customers.

This is a senior, high-impact role for someone who can combine deep technical expertise with strong stakeholder engagement, shaping data strategy while guiding engineering teams to deliver robust, enterprise-grade solutions.

What you'll do:
  • Lead end-to-end architecture for cloud-based data ecosystems, including lakehouse and enterprise analytics platforms
  • Translate business needs into scalable architectural designs, technical roadmaps, and governance frameworks
  • Architect and implement Databricks solutions using Unity Catalog, Delta Lake, Databricks SQL, and Workflows (a brief illustrative sketch follows this list)
  • Define and enforce data modelling standards across relational, dimensional, and lakehouse architectures
  • Design and oversee ETL/ELT frameworks, metadata strategies, and reusable transformation standards
  • Establish best practices for data ingestion, quality, lineage, cataloging, and MDM (preferably Profisee)
  • Partner with engineering teams to ensure performance, security, and architectural consistency
  • Build cloud-native reference architectures using Azure services such as ADF, ADLS, Synapse, and Stream Analytics
  • Collaborate with executive stakeholders to define data governance, taxonomy, and metadata strategies
  • Mentor and guide junior engineers through design reviews and technical decision-making
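To give a concrete flavour of the Databricks work described above, here is a minimal, purely illustrative PySpark sketch of a bronze-to-silver Delta Lake step. The catalog, schema, and table names (main.sales.orders_bronze / orders_silver) are hypothetical placeholders, not part of this posting, and the spark session is assumed to be the one Databricks provides in a notebook or Workflows task.

    # Minimal, illustrative Delta Lake "bronze -> silver" step on Databricks.
    # All catalog/schema/table names are hypothetical placeholders.
    from pyspark.sql import functions as F

    def build_silver_orders(spark):
        """Read raw orders, apply a basic quality rule, and publish a curated
        Delta table under Unity Catalog's three-level namespace."""
        bronze = spark.read.table("main.sales.orders_bronze")

        silver = (
            bronze
            .dropDuplicates(["order_id"])                      # simple data-quality rule
            .withColumn("ingested_at", F.current_timestamp())  # basic lineage/audit column
        )

        (silver.write
               .format("delta")
               .mode("overwrite")
               .saveAsTable("main.sales.orders_silver"))

In practice, a step like this would typically run as a task in a Databricks Workflow, with access governed through Unity Catalog permissions rather than in the code itself.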
Minimum Qualifications:
  • Bachelor's degree in Computer Science, Engineering, MIS, or related field
  • 12+ years total experience, with 3+ years in data architecture
  • 3+ years hands-on Databricks experience (Unity Catalog, Delta Lake, SQL, Workflows)
  • Strong expertise in Python, Apache Spark, and distributed data processing
  • Advanced SQL skills, including performance tuning and optimization
  • Proven experience designing lakehouse architectures and cloud data platforms
  • Hands-on experience with Azure Data Services (ADF, ADLS, Azure SQL, Synapse, Stream Analytics or Fabric)
  • Deep understanding of data modelling (3NF, Kimball, Inmon) and enterprise data warehousing
  • Prior consulting experience with enterprise clients
  • Familiarity with CI/CD and IaC tools (Terraform, ARM, Bicep)
Preferred Skills:
  • Experience building automated CI/CD pipelines and environment strategies
  • Exposure to Microsoft Fabric or other modern analytics platforms
  • Experience with Big Data technologies (HDFS, Hive, MapReduce)
  • Familiarity with NoSQL systems (Cassandra, MongoDB, HBase, CouchDB)
  • Experience with BI tools such as Power BI, Tableau, Qlik, or Cognos

Job ID: 139150443