
Job Description

What you'll be doing:

  • Lead the design and implementation of scalable, secure, and high-performing data solutions using Azure Databricks, Delta Lake, and Delta Live Tables.
  • Own the complete architecture process, including requirements gathering, solution design, documentation, and technical reviews, and implement advanced data solutions on the Databricks platform.
  • Define best practices for cloud data architecture, data modeling, ELT/ETL pipelines, workspace setup, cluster management, repos, and job orchestration.
  • Coordinate and communicate with onshore and offshore teams, including end-users, data engineers, reporting specialists, and business analysts.
  • Ensure solution compliance with data privacy, security, and governance standards.
  • Conduct performance tuning and optimization of Databricks clusters.
  • Ensure data quality, lineage, and observability across all pipelines.
  • Monitor and troubleshoot data pipelines to ensure data quality and reliability.
  • Lead the integration of Databricks with other data platforms and tools.
  • Create processes and workflows to support data solution documentation, and lead solution reviews and audits for quality.
  • Stay updated with industry best practices, Databricks and Azure developments, and introduce improvements to the technical stack.

What we're looking for:

  • At least 10 years of experience in enterprise data architecture, including the design and implementation of large-scale data platforms, and 5+ years of relevant data engineering experience on Databricks.
  • Strong expertise in Databricks (Workspace, Clusters, Jobs, Repos, Delta Live Tables).
  • Deep hands-on expertise with Azure Databricks, PySpark, Delta Lake, Unity Catalog, MLflow, dbt, and associated Azure data services (Data Lake, SQL, Synapse, ADF).
  • Proven experience migrating from legacy data warehouse and reporting systems to modern cloud platforms.
  • Experience in data modeling, data warehousing (OLTP, OLAP), security, governance, DevOps, and MLOps.
  • Hands-on experience with CI/CD, version control (Git), and DevOps practices for data engineering.
  • Excellent communication, problem-solving, and analytical skills, with the ability to collaborate effectively with cross-functional teams and influence decision-making at all levels of the organization.
  • Architect-level certifications in Databricks and dbt are preferred.
  • Ability to mentor and coach junior architects, engineers, and BI/reporting developers.

About Company

Arrow.com is your resource for electronic component products, datasheets, reference designs and technology news. Explore Arrow.com today.

Job ID: 138418863