
cosmicfusionlabs private limited

Databricks Architect

8-12 Years
  • Posted 4 hours ago

Job Description

Engagement: Contract, 12 months

Location: Remote

Working Hours: 8-hour shift, IST daytime

Compensation: 1,40,000/month

Key Responsibilities and Core Competencies:

• You will be responsible for architecting solutions primarily based on Databricks (on AWS, Azure, or another cloud platform) for different pharma clients.

• Interact with clients and act as a thought partner: understand the business requirements, recommend the best features of the tool, and drive development and quality assurance following best practices.

• Work collaboratively with clients and with onshore and offshore teams.

• Lead a team: provide guidance and resolve technical and business-related problems.

• Provide expert-level advice and technical mentorship on best practices for data pipelines, data lakes, and analytics in the Databricks environment.

• Optimize cloud-based applications to improve efficiency, reduce efforts/cost, and increase business value.

What You Bring:

• 8–12 years of expertise in architecting, designing, and implementing scalable, secure, and high-performance data platforms using Azure and Databricks.

• Go-to expert on Azure Data Services and Databricks platform architecture.

• Support DevOps, CI/CD, and monitoring practices for data workloads.

• Design and implement POCs of modern data architecture patterns (Lakehouse, Delta Lake, medallion architecture).

• Strong communication skills to lead meetings with client architecture and platform teams.

• Design real-time and batch processing pipelines (including API integration).

• Experience setting up a self-serve layer for different user personas.

• Ability to learn quickly and build POCs on new Databricks offerings.

• Design ETL/ELT and data lake applications.

• Good pharma domain knowledge, mainly for commercial use cases.
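The medallion architecture mentioned above layers data as bronze (raw), silver (cleansed), and gold (business-level aggregates). As a rough illustration of the idea only, here is a hypothetical plain-Python sketch; on Databricks each layer would typically be a Delta table built with PySpark, and all record fields and names below are invented for the example.

```python
# Hypothetical sketch of the medallion pattern (bronze -> silver -> gold).
# On Databricks, each layer would be a Delta table; names are illustrative.

raw_events = [  # "bronze": raw ingested records, kept as-is, errors and all
    {"id": "1", "amount": "100.5", "region": "EU"},
    {"id": "2", "amount": "bad", "region": "EU"},   # malformed amount
    {"id": "3", "amount": "49.5", "region": "US"},
]

def to_silver(records):
    """Cleanse and type raw records; skip rows that fail validation."""
    silver = []
    for r in records:
        try:
            silver.append({
                "id": int(r["id"]),
                "amount": float(r["amount"]),
                "region": r["region"],
            })
        except ValueError:
            continue  # in practice, quarantine bad rows rather than drop silently
    return silver

def to_gold(records):
    """Aggregate the cleansed records to a business view: revenue per region."""
    gold = {}
    for r in records:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

print(to_gold(to_silver(raw_events)))  # {'EU': 100.5, 'US': 49.5}
```

The key design point the pattern encodes is that raw data is never mutated: each layer is derived from the previous one, so a fix to cleansing logic can rebuild silver and gold from bronze at any time.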

Preferred Skills:

• Azure/AWS, Databricks, PySpark, SQL

• Data Architecture, ETL/ELT, NoSQL databases

• Data Pipeline Orchestration (Airflow/Step Functions), CI/CD and DevOps tools

• Team and client management experience

• Project delivery expertise

• Excellent communication and analytical problem-solving skills

• Knowledge of AI tools and models is a plus.

If you match the above requirements, please share your resume at [Confidential Information] or DM directly.


Job ID: 147485291
