Project Role : Data Architect
Project Role Description : Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills : Microsoft Azure Databricks
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary
As a Data Architect, a typical day involves defining the data requirements and designing the structure necessary for the application. This role includes modeling the data architecture, planning how data will be stored efficiently, and ensuring seamless integration across various components. The position requires a thoughtful approach to organizing data to support application functionality and scalability, collaborating with different stakeholders to align data strategies with project goals, and continuously refining data models to meet evolving needs.
Key Responsibilities
- Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large-sized project or proposal
- Discuss Big Data architecture and related issues with the client architect/team (in area of expertise)
- Experience implementing Databricks GenAI/Agentic AI use cases
- Knowledge of LLMs, prompt engineering, and AI Foundry
- Candidate should have worked on data governance solutions
- Analyze and assess the impact of the requirements on the data and its lifecycle
- Lead Big Data architecture and design medium-to-large cloud-based data and analytics solutions using the Lambda architecture
- Breadth of experience in various client scenarios and situations
- Experienced in Big Data architecture-based sales and delivery
- Thought leadership and innovation
- Lead creation of new data assets & offerings
- Experience in handling OLTP and OLAP data workloads
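For context on the Lambda architecture referenced above: it answers queries by merging a batch layer (recomputed over full history) with a speed layer (incremental, recent data). A minimal illustrative sketch in plain Python — a toy for intuition only, not Databricks code; all names here are hypothetical:

```python
from collections import Counter

def batch_view(events):
    """Batch layer: recompute an aggregate view over the full event history."""
    return Counter(e["user"] for e in events)

def speed_view(recent_events):
    """Speed layer: incrementally aggregate only recent, not-yet-batched events."""
    return Counter(e["user"] for e in recent_events)

def serve(batch, speed):
    """Serving layer: merge batch and speed views to answer a query."""
    return batch + speed

history = [{"user": "a"}, {"user": "b"}, {"user": "a"}]
recent = [{"user": "a"}]
merged = serve(batch_view(history), speed_view(recent))
print(merged["a"])  # 3
```

In a real Databricks solution the batch and speed layers would typically be Spark batch jobs and Structured Streaming queries over Delta tables, but the merge-of-two-views idea is the same.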
Technical Experience
- Experience working with the Medallion architecture and Delta Lakehouse principles
- Expert-level experience designing and architecting solutions in Azure Databricks, Azure Data Lake, and Delta Lake
- Experience with Databricks GenAI implementations
- Experience with Azure Purview, Profisee, and/or Unity Catalog
- Well versed in real-time and batch streaming concepts, with hands-on implementation experience
- Expert-level experience with Azure Databricks and related technologies: PySpark, Python, Scala, and SQL
- Experience with one or more real-time/batch ingestion mechanisms, including Delta Live Tables and Auto Loader
- Experience handling medium-to-large Big Data implementations
- Strong understanding of data strategy, data quality, and Delta Lake components
- For Level 8 - Candidate must have 10-12 years of IT experience, with around 5 years of extensive Big Data experience (design + build) in Databricks
- For Level 9 - Candidate must have 7-10 years of IT experience, with around 5 years of Big Data experience (design + build) in Databricks
- Act as the architect for a medium-sized client delivery project
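As a rough illustration of the Medallion (bronze/silver/gold) layering mentioned above — shown here as a toy in plain Python rather than Delta Lake/PySpark; the records and column names are hypothetical:

```python
# Toy Medallion flow: bronze (raw ingest) -> silver (cleaned) -> gold (aggregated).
bronze = [
    {"id": "1", "amount": "10.5", "country": "US"},
    {"id": "2", "amount": "bad", "country": "US"},   # malformed record
    {"id": "3", "amount": "4.0", "country": "DE"},
]

def to_silver(rows):
    """Silver layer: validate and type-cast raw records, dropping bad rows."""
    silver = []
    for r in rows:
        try:
            silver.append(
                {"id": r["id"], "amount": float(r["amount"]), "country": r["country"]}
            )
        except (ValueError, KeyError):
            pass  # in a real pipeline these would land in a quarantine table
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate (revenue per country)."""
    gold = {}
    for r in rows:
        gold[r["country"]] = gold.get(r["country"], 0.0) + r["amount"]
    return gold

print(to_gold(to_silver(bronze)))  # {'US': 10.5, 'DE': 4.0}
```

On Azure Databricks each layer would normally be a Delta table, with Auto Loader or Delta Live Tables feeding bronze and expectations/constraints enforcing quality at the silver stage.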
Professional Experience
- Should be able to drive technology design meetings and propose technical designs and architecture
- Should have excellent client communication skills
- Should have good analytical and problem-solving skills
Educational Qualification
- Must have: BE/BTech/MCA
- Good to have: ME/MTech