
Wroots Global Private Limited

Data Architect

12-18 Years
30.5 - 55 LPA
  • Posted 10 hours ago

Job Description

Job Title: Senior Azure Data Engineer (Databricks & PySpark)

Experience: 12–18 Years

Location: Bangalore

Employment Type: Full-Time

Job Summary

We are looking for a highly skilled Senior Azure Data Engineer with strong expertise in Azure Databricks, PySpark, and Lakehouse architecture. The ideal candidate will have extensive experience in building scalable data pipelines, optimizing big data workloads, and implementing secure, governed data platforms on Azure.

Primary Skills (Must Have)

  • 6+ years of experience in Azure Databricks with PySpark
  • 6+ years of experience in Databricks Workflows & Unity Catalog
  • 5+ years of experience in Azure Data Factory (ADF)
  • 5+ years of experience in ADLS Gen2
  • 5+ years of experience in Azure SQL
  • 5+ years of experience in Azure Cloud Platform
  • 4+ years of experience in Python programming & packaging

Key Responsibilities

Data Engineering & Lakehouse Development

  • Design, develop, and maintain Lakehouse solutions using Azure Databricks
  • Build pipelines for batch and near real-time data processing
  • Integrate data from ERP systems, APIs, RDBMS, NoSQL, and on-prem sources
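
As a rough illustration of the multi-source integration above, the sketch below dispatches reader functions from a source-type registry. The source names and reader stubs are hypothetical; in a real Databricks pipeline each reader would return a Spark DataFrame, but plain dicts keep the pattern self-contained here.

```python
# Minimal sketch: route ingestion by source type (hypothetical names).
# Real readers would return Spark DataFrames; dicts stand in for them.

def read_api(cfg):
    return {"source": "api", "endpoint": cfg["endpoint"]}

def read_rdbms(cfg):
    return {"source": "rdbms", "table": cfg["table"]}

READERS = {"api": read_api, "rdbms": read_rdbms}

def ingest(cfg):
    """Route a source config to the matching reader."""
    if cfg.get("type") not in READERS:
        raise ValueError(f"unsupported source type: {cfg.get('type')}")
    return READERS[cfg["type"]](cfg)

print(ingest({"type": "rdbms", "table": "dbo.orders"}))
```

New source types (NoSQL, on-prem extracts) then become one registry entry each, rather than another branch in pipeline code.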

Databricks & Spark Optimization

  • Optimize Spark jobs using RDD/DataFrame APIs
  • Implement partitioning strategies & file format optimization (Parquet/Delta)
  • Tune Spark SQL performance and manage clusters, libraries, and runtime versions
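
One common heuristic behind the partitioning and file-format bullets is sizing output files toward a fixed target before writing Parquet/Delta. The sketch below derives a repartition count from an estimated dataset size; the ~128 MB target and helper name are illustrative, not from the posting.

```python
# Heuristic sketch: choose a repartition count so output Parquet/Delta
# files land near a target size (assumed ~128 MiB here; tune per workload).

TARGET_FILE_BYTES = 128 * 1024 * 1024  # illustrative target

def num_output_partitions(estimated_bytes: int) -> int:
    """At least one partition; otherwise ceil(size / target)."""
    if estimated_bytes <= 0:
        return 1
    return -(-estimated_bytes // TARGET_FILE_BYTES)  # ceiling division

# In PySpark this would feed df.repartition(n) before df.write.format("delta"):
print(num_output_partitions(10 * 1024**3))  # 10 GiB -> 80 partitions
```

Avoiding many small files (or a few huge ones) is the point: both extremes hurt Spark SQL scan performance.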

Data Governance (Unity Catalog)

  • Manage data access, permissions, and lineage using Unity Catalog
  • Integrate with Azure AD, external metastores, and audit frameworks
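
Unity Catalog permissions are typically managed with SQL `GRANT` statements; the helper below merely assembles one for readability. The catalog/schema/principal names are made up, and on Databricks the resulting string would be executed via `spark.sql(...)`.

```python
# Sketch: build a Unity Catalog GRANT statement (names are hypothetical).
# On Databricks the resulting string would be run with spark.sql(...).

def grant_select(catalog: str, schema: str, table: str, principal: str) -> str:
    return (
        f"GRANT SELECT ON TABLE {catalog}.{schema}.{table} "
        f"TO `{principal}`"
    )

print(grant_select("main", "sales", "orders", "data_analysts"))
# GRANT SELECT ON TABLE main.sales.orders TO `data_analysts`
```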

Data Orchestration

  • Develop workflows using Azure Data Factory & Databricks Workflows
  • Build reusable pipelines with triggers, dependencies, and parameterization
  • Implement error handling and performance tuning strategies
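
The error-handling bullet above often amounts to retrying transient activity failures with exponential backoff. A minimal, framework-free sketch (attempt count, delays, and exception handling are placeholder choices):

```python
import time

# Sketch: retry a pipeline activity with exponential backoff.
# attempts/base_delay are illustrative defaults, not from the posting.

def run_with_retry(activity, attempts: int = 3, base_delay: float = 0.01):
    last_err = None
    for i in range(attempts):
        try:
            return activity()
        except Exception as err:               # narrow this in real code
            last_err = err
            time.sleep(base_delay * (2 ** i))  # 1x, 2x, 4x, ...
    raise RuntimeError("activity failed after retries") from last_err

calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(run_with_retry(flaky))  # succeeds on the third attempt
```

ADF and Databricks Workflows offer built-in retry policies for the same purpose; the sketch shows the logic those settings encode.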

Storage & Data Management

  • Work with ADLS Gen2, implementing the bronze-silver-gold (medallion) architecture
  • Manage RBAC, ACLs, lifecycle policies, and storage optimization
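
The bronze-silver-gold layout is usually encoded as a path convention in ADLS Gen2. The helper below shows one illustrative convention; the storage account and container names are placeholders, not from the posting.

```python
# Sketch: one possible ADLS Gen2 path convention for medallion layers.
# Account/container names are placeholders, not from the posting.

LAYERS = ("bronze", "silver", "gold")

def lake_path(layer: str, domain: str, dataset: str,
              account: str = "mydatalake", container: str = "lake") -> str:
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    return (f"abfss://{container}@{account}.dfs.core.windows.net/"
            f"{layer}/{domain}/{dataset}")

print(lake_path("bronze", "sales", "orders"))
# abfss://lake@mydatalake.dfs.core.windows.net/bronze/sales/orders
```

Centralizing the convention in one function keeps pipelines consistent and makes RBAC/ACL scoping by layer straightforward.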

SQL & Metadata Management

  • Develop T-SQL queries and stored procedures
  • Maintain metadata layers on Azure SQL

Azure Ecosystem & DevOps

  • Work with VNets, Private Endpoints, Azure Key Vault, and Managed Identities
  • Implement monitoring via Azure Monitor
  • Automate deployments using Azure DevOps, ARM, Bicep, or Terraform

Python Development

  • Write modular, reusable, and testable Python code
  • Manage environments using pip/Poetry/Conda
  • Implement unit testing using PyTest/unittest
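
For the unit-testing bullet, a minimal self-contained example using the standard-library `unittest` (the function under test is a made-up example, not from the posting):

```python
import unittest

# Function under test (hypothetical example, not from the posting).
def normalize_column_name(name: str) -> str:
    """Lowercase and snake_case a column name."""
    return name.strip().lower().replace(" ", "_")

class TestNormalizeColumnName(unittest.TestCase):
    def test_spaces_become_underscores(self):
        self.assertEqual(normalize_column_name("Order Date"), "order_date")

    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_column_name("  Amount "), "amount")

if __name__ == "__main__":
    unittest.main(argv=["prog"], exit=False)
```

The same structure carries over to PyTest, which discovers plain `test_*` functions and bare `assert` statements without the class boilerplate.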

Leadership & Collaboration

  • Lead solution design discussions
  • Prepare HLD/LLD documents and architecture diagrams
  • Mentor junior engineers and ensure coding best practices
  • Collaborate with business stakeholders, QA, and product teams

Soft Skills

  • Strong communication (verbal & written)
  • Good stakeholder management
  • Problem-solving and analytical thinking
  • Experience in Agile/Scrum (Jira/Azure DevOps)
  • Proactive and accountable work approach

Expected Outcome

  • Build scalable data engineering solutions on Azure
  • Deliver high-quality, optimized, and governed data pipelines
  • Collaborate effectively across teams and stakeholders
  • Follow best practices in DevOps and data architecture

Good to Have Skills

  • Experience with Microsoft Entra ID (formerly Azure Active Directory)
  • Knowledge of GitHub Actions
  • Exposure to orchestration tools like Airflow, Dagster, Logic Apps
  • Experience with Kafka / Azure Event Hub (event-driven architecture)
  • Knowledge of Google Cloud Pub/Sub
  • Experience with CDC tools (Debezium)
  • Exposure to Azure Synapse & data migration projects
  • Experience with Google Cloud Storage

More Info

Open to candidates from: Indian

Job ID: 147247901
