AuxoAI partners with leading enterprises to modernize their data ecosystems using cloud-native and AI-ready architectures. We are seeking an experienced Dremio Data Architect to lead the design and optimization of our modern data lakehouse platform, enabling governed, high-performance analytics across structured and unstructured data.
Role Overview
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
Responsibilities:
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 10+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.