
Quantiphi

Senior Data Architect - Databricks

  • Posted an hour ago

Job Description

Job Role - Senior Technical Architect - Databricks

Location: Bangalore

Experience: 15+ years

As a Senior Technical Architect - Databricks at Quantiphi, you will solve enterprise data problems and develop solutions for data migration, storage, and processing.

Required Skills:

  • Experience: 15+ years of relevant experience in building cloud-native, hybrid, or multi-cloud solutions, including Databricks and AWS.
  • Experience with Databricks implementations - developing data pipelines using PySpark
  • Experience with Databricks Workspaces, Notebooks, Delta Lake, and APIs
  • Expertise in building production-grade solutions using AWS (Redshift, S3, Glue, Lambda, Airflow), PySpark, and Python, including data pipelines built with Glue/Airflow/SageMaker.
  • Hands-on experience in working on large cloud-based migration workloads involving SQL and NoSQL Databases
  • Experience with migration of databases to AWS. Strong ability to create roadmaps and architectures for executing migration workloads on AWS.
  • Exposure to ETL tools and data warehouses.
  • Experience in SQL and query optimization. Proficient in SQL-based technologies (MySQL, Oracle DB, SQL Server, etc.)
  • Experience in creation and maintenance of data dictionaries, metadata repositories, and data lineage documentation.
  • Experience building and supporting large-scale systems in a production environment.
  • Strong skills in mentoring and managing teams of junior and senior data engineers and leading end-to-end delivery of technical workloads.
  • Experience using GitHub/CodeCommit for development activities
  • Experience designing and optimizing orchestration frameworks using Apache Airflow / Amazon MWAA including DAG optimization, dependency management, retries, monitoring, and operationalization.
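The orchestration bullet above mentions DAG optimization, dependency management, and retries. As a simplified, library-free sketch of those two ideas (not Airflow's actual API; task names here are purely illustrative):

```python
# Simplified sketch of what an orchestrator does under the hood:
# resolve task dependencies into an execution order, then run a
# task under a bounded retry policy. Not Airflow's API.
from graphlib import TopologicalSorter

def run_with_retries(fn, max_retries=2):
    """Call fn(), retrying up to max_retries times if it raises."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_retries:
                raise

# Hypothetical DAG: extract -> transform -> validate -> load
# (load also waits directly on transform)
deps = {
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load']

attempts = []
def flaky_task():
    attempts.append(1)
    if len(attempts) < 2:   # fail on the first attempt only
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky_task))  # 'ok' -- succeeded on the retry
```

Production orchestrators add scheduling, state persistence, and monitoring on top of this core ordering-plus-retry loop.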

Good to have skills:

  • Experience with AWS services - S3, Redshift, Secrets Manager
  • Exposure to SageMaker Unified Studio (SMUS) and modern business catalog/governance implementations for metadata management, data discovery, and governed analytics access.
  • Experience in implementing data integration projects using ETL
  • Experience in using Airflow or Step Functions for orchestration
  • Exposure to IaC tools like Terraform and to CI/CD tools
  • Prior experience in migrating (on-prem to cloud) and processing large amounts of data
  • Experience in implementing data lakes and data warehouses on the cloud
  • Experience in implementing industry best practices

More Info

Job ID: 147513451