
Data Platform Engineer

5-7 Years
  • Posted 7 hours ago

Job Description

Project Role : Data Platform Engineer

Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Must have skills : Microsoft Azure Databricks

Good to have skills : NA

Minimum 5 year(s) of experience is required

Educational Qualification : 15 years of full-time education

Summary:

As a Data Platform Engineer, a typical day involves contributing to the development and refinement of the data platform blueprint and design, ensuring that all relevant components are effectively incorporated. This role requires close collaboration with Integration Architects and Data Architects to maintain seamless integration between various systems and data models. The position demands active engagement in aligning platform strategies with organizational goals, facilitating smooth data flow and interoperability across multiple teams and technologies. The engineer plays a key role in supporting the overall architecture and ensuring that the platform meets evolving business and technical requirements.

Roles & Responsibilities:

  • Expected to act as an SME; collaborate with and manage the team to perform.
  • Responsible for team decisions.
  • Engage with multiple teams and contribute to key decisions.
  • Provide solutions to problems for their immediate team and across multiple teams.
  • Lead efforts to optimize data platform performance and scalability.
  • Mentor junior team members to enhance their technical capabilities and understanding.
  • Coordinate cross-functional initiatives to align data platform development with enterprise objectives.
  • Build and operate scalable Lakehouse pipelines on Databricks/Azure. Own ELT/streaming, Delta Lake optimization, Unity Catalog governance, and CI/CD. Integrate ADLS/ADF/Synapse, and deliver high-quality data sets for BI/ML/GenAI.
  • Must-haves: PySpark, SQL, Databricks (Delta, DLT/Workflows), Azure data services, Unity Catalog, CI/CD.
  • Nice-to-have skills: MLflow/Feature Store, Power BI, streaming/CDC, vector search/RAG, Terraform.
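To give candidates a feel for the upsert (MERGE) pattern at the heart of the Delta Lake pipeline work described above, here is a minimal pure-Python sketch of `whenMatchedUpdate` / `whenNotMatchedInsert` semantics. It deliberately avoids a Spark dependency; the table contents and key name are hypothetical illustrations, not part of this role's actual codebase.

```python
# Sketch of Delta Lake MERGE semantics (whenMatchedUpdate /
# whenNotMatchedInsert) using plain Python lists of dicts.
# All data and key names below are hypothetical examples.

def merge_upsert(target, updates, key="id"):
    """Merge `updates` into `target`: rows whose key matches an
    existing row overwrite it; unmatched rows are inserted."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # update if matched, insert otherwise
    return sorted(merged.values(), key=lambda r: r[key])

# Hypothetical dimension table plus an incoming CDC batch.
target = [{"id": 1, "city": "Pune"}, {"id": 2, "city": "Mumbai"}]
updates = [{"id": 2, "city": "Delhi"}, {"id": 3, "city": "Chennai"}]
result = merge_upsert(target, updates)
# id 2 is updated to "Delhi"; id 3 is newly inserted.
```

In a Databricks pipeline the same outcome would typically be expressed declaratively with `MERGE INTO` on a Delta table rather than in driver-side Python.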

Professional & Technical Skills:

  • Must-Have Skills: Proficiency in Microsoft Azure Databricks.
  • Experience with distributed computing frameworks and big data processing.
  • Strong knowledge of cloud-based data storage and management solutions.
  • Familiarity with data integration techniques and ETL processes.
  • Ability to troubleshoot and resolve complex data platform issues.
  • Understanding of data security and governance principles within cloud environments.

Additional Information:

  • The candidate should have a minimum of 5 years of experience in Microsoft Azure Databricks.
  • This position is based at our Pune office.
  • 15 years of full-time education is required.


Job ID: 147140651