
Engagement: Contract, 12 months
Location: Remote
Working Hours: 8-hour shift, IST daytime
Compensation: 1,40,000/month
Key Responsibilities and Core Competencies:
• You will be responsible for architecting solutions primarily based on Databricks (on AWS, Azure, or other cloud platforms) for different pharma clients.
• Act as a client-facing thought partner: understand the business requirements, recommend the best features of the platform, and ensure development and quality assurance follow best practices.
• Work collaboratively with clients and with onshore and offshore teams.
• Lead a team, providing guidance and resolving technical and business-related problems.
• Provide expert-level advice and technical mentorship on best practices for data pipelines, data lakes, and analytics in the Databricks environment.
• Optimize cloud-based applications to improve efficiency, reduce efforts/cost, and increase business value.
What You Bring:
• 8–12 years of expertise in architecting, designing, and implementing scalable, secure, and high-performance data platforms using Azure and Databricks.
• Go-to expert on Azure Data Services and Databricks platform architecture.
• Support DevOps, CI/CD, and monitoring practices for data workloads.
• Design and implement POCs of modern data architecture patterns (Lakehouse, Delta Lake, medallion architecture).
• Strong communication skills to lead meetings with client architecture and platform teams.
• Design real-time and batch processing pipelines (including API integration).
• Experience setting up a self-serve layer for different user personas.
• Ability to learn quickly and perform POCs on new Databricks offerings.
• Design ETL/ELT and data lake applications.
• Strong pharma domain knowledge, mainly for commercial use cases.
Preferred Skills:
• Azure/AWS, Databricks, PySpark, SQL
• Data Architecture, ETL/ELT, NoSQL databases
• Data Pipeline Orchestration (Airflow/Step Functions), CI/CD and DevOps tools
• Team and client management experience
• Project delivery expertise
• Excellent communication and analytical problem-solving skills
• Knowledge of AI tools and models is a plus.
If you match the above requirements, please share your resume at [Confidential Information] or DM directly.
Job ID: 147485291
Skills:
PySpark, SQL, ELT, Data Architecture, Databricks, Azure, DevOps tools, AWS, ETL, Airflow, Step Functions, CI/CD, Data Pipeline Orchestration, NoSQL databases