
Company Description
DataMoo AI, a Chennai-based start-up and the AI Unit of Matvik Solutions Pvt Ltd, specializes in developing innovative products in the field of Artificial Intelligence. The company collaborates with highly skilled doctorates and industry experts to deliver cutting-edge solutions in areas such as conversational AI, natural language processing, cloud data analytics, machine learning, and image and video analytics.
This is a high-visibility role with frequent interaction with the Investments, Investor Relations, Financial Planning and Analysis, and Accounting teams. While partly stakeholder-facing, the role is primarily focused on designing data models, writing data transformation code, and operating pipelines in Azure Databricks.
Role and Responsibilities
Data Engineering and Pipelines
- Build and maintain end-to-end data pipelines in Azure Databricks:
  - Upstream data ingestion from Yardi, Salesforce, Argus, and other systems.
  - Transformation and enrichment logic using Delta Lake, SQL, and (where needed) Python.
  - Downstream data delivery into curated tables, data marts, and business intelligence layers.
- Implement job orchestration, scheduling, and monitoring to ensure reliable daily and monthly runs.
Data Modeling and Data Marts
- Design and implement logical and physical data models for investment performance, investor and partner views, and allocation/billing related to corporate financial planning and analysis.
- Create and maintain data marts and reporting-ready layers optimized for consumption by Domo and other tools.
- Apply dimensional modeling best practices for keys, hierarchies, and slowly changing dimensions.
Performance, Quality, and Architecture
- Optimize queries, tables, and pipelines for performance and cost (cluster configuration, partitioning, indexing, and caching where appropriate).
- Implement data quality checks, validations, and alerts; work with business analysts and business intelligence developers to resolve issues.
- Contribute to the design of the overall data architecture, including bronze/silver/gold patterns, naming standards, and governance.
Collaboration and Support
- Work closely with the Product Owner / Senior Business Analyst to align data models with business definitions and the glossary.
- Provide curated datasets and views to Business Intelligence Developers for dashboards and reports.
- Support system and user acceptance testing by investigating data issues and providing fixes and explanations.
Skills and Experience
Core Skills
- Strong data engineering skills:
  - Building ingestion, transformation, and delivery pipelines.
  - Writing and maintaining production-grade data transformation code.
- Strong data modeling skills (dimensional and relational).
- Strong SQL skills, including tuning and optimization.
- Ability to design scalable, maintainable data architectures that serve multiple reporting and analytics needs.
- Problem-solving mindset with a focus on reliability, performance, and data quality.
Tools
- Hands-on experience with Azure Databricks, including Delta Lake tables, notebooks, jobs, and cluster management.
- Strong SQL; Python experience is a plus but not required.
- Experience with Unity Catalog or similar tools for data governance and lineage.
- Experience with Databricks GenAI for building agents that extract and process information from structured and unstructured data.
- Use of Git or similar tools for source control and code management.
- Exposure to Domo or other business intelligence tools is helpful for understanding downstream needs.
Experience
- Typically 4-8 or more years in data engineering or data modeling roles.
- Proven experience building and running production data pipelines, data marts, and data warehouses or data lakes.
- Prior work with financial, investment, or property data is strongly preferred.
- Background in real estate, private equity, asset management, or financial services is a strong advantage.
Drop your CV to [Confidential Information]
Job ID: 135076683