
Position Overview: Lead the end-to-end design, development, and optimisation of enterprise data platforms while driving the adoption of generative AI solutions across the organisation. This role combines deep expertise in modern data engineering, cloud-based architectures, and advanced AI/LLM technologies to build scalable, secure, and intelligent data-driven systems.
1. Data Engineering & Architecture
Design and implement robust data pipelines using modern tools (e.g., Spark, Databricks, Kafka, Airflow); see the illustrative sketch after this list.
Build and maintain scalable ETL/ELT frameworks for structured, semi-structured, and unstructured data.
Own the architecture of data lakes, data warehouses (Snowflake/BigQuery/Redshift), and real-time streaming systems.
Optimise data models for analytics, ML workloads, and business intelligence use cases.
Ensure high data quality, governance, lineage, security, and compliance with organisational standards.
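To illustrate the orchestration work referenced above, here is a minimal sketch of a daily ELT DAG in Airflow. The DAG name, the placeholder extract/transform/load callables, and the sample records are hypothetical and only show the shape of such a pipeline, not this role's actual stack.

```python
# Minimal daily ELT sketch with Airflow (names and logic are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system (API, CDC feed, object store).
    return [{"id": 1, "amount": 125.0}]


def transform(ti, **context):
    # Placeholder: apply cleansing / conformance rules before loading.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "amount_usd": round(r["amount"], 2)} for r in rows]


def load(ti, **context):
    # Placeholder: write curated rows to the warehouse (Snowflake/BigQuery/Redshift).
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="daily_elt_example",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)

    extract_t >> transform_t >> load_t
```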
2. Generative AI & Machine Learning
Develop, fine-tune, and deploy LLM-based solutions (e.g., GPT, Llama, Claude models) for business automation, insights generation, and decision support.
Build retrieval-augmented generation (RAG) architectures and vector databases (Pinecone, FAISS, Chroma); see the illustrative sketch after this list.
Create custom AI agents for internal workflows (credit analysis, due diligence, underwriting support, customer interactions, etc.).
Lead experimentation with multimodal AI (text, image, document intelligence).
Collaborate with business teams to convert functional problems into scalable AI-driven solutions.
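As a rough sketch of the RAG pattern referenced above, the example below builds a tiny in-memory FAISS index and retrieves context for a question. The embed() function, the sample documents, and the prompt template are hypothetical stand-ins; a real setup would plug in an embedding model and send the assembled prompt to the chosen LLM.

```python
# Minimal RAG flow over a FAISS index (embedding and generation steps are stubbed).
import numpy as np
import faiss

DIM = 384  # assumed embedding dimension


def embed(texts):
    # Hypothetical embedding step: random vectors stand in for a real model,
    # so retrieval is not semantically meaningful until one is plugged in.
    rng = np.random.default_rng(abs(hash(tuple(texts))) % (2**32))
    vecs = rng.standard_normal((len(texts), DIM)).astype("float32")
    faiss.normalize_L2(vecs)
    return vecs


documents = [
    "Credit policy: loans above 1M require two approvals.",
    "Underwriting checklist: verify income, collateral, and credit history.",
]

index = faiss.IndexFlatIP(DIM)  # inner product on normalised vectors = cosine similarity
index.add(embed(documents))


def retrieve(question, k=2):
    # Return the k nearest document chunks for the question embedding.
    scores, ids = index.search(embed([question]), k)
    return [documents[i] for i in ids[0]]


def answer(question):
    # Assemble a grounded prompt; in practice, send it to the chosen LLM.
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"


print(answer("What does the underwriting checklist require?"))
```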
3. Platform & Infrastructure
Deploy and manage AI and data workloads on cloud platforms (AWS / Azure / GCP).
Implement MLOps & LLMOps pipelines for CI/CD, automated testing, and monitoring of AI models; see the illustrative sketch after this list.
Integrate AI systems with enterprise applications and APIs for seamless workflow automation.
Evaluate, adopt, and integrate new AI tools, frameworks, and model providers.
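The MLOps/LLMOps work referenced above usually starts with experiment tracking; below is a minimal sketch using MLflow (one of the tools named in the qualifications). The experiment name, parameters, and metric values are purely illustrative assumptions.

```python
# Minimal experiment-tracking sketch with MLflow (all names and values are illustrative).
import mlflow

mlflow.set_experiment("rag-answer-quality")  # hypothetical experiment name

with mlflow.start_run(run_name="baseline"):
    # Log the configuration being evaluated.
    mlflow.log_param("embedding_model", "placeholder-embedder")
    mlflow.log_param("top_k", 2)
    # Log evaluation results so runs can be compared and monitored over time.
    mlflow.log_metric("answer_relevance", 0.87)
    mlflow.log_metric("latency_ms", 420)
```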
4. Stakeholder & Cross-Functional Collaboration
Work closely with analytics, business, product, and engineering teams to deliver impactful AI and data initiatives.
Translate complex technical concepts into clear business insights.
Mentor junior engineers and contribute to building an internal AI/Data capability.
Desired Qualifications & Experience: Bachelor's degree or equivalent with 6-12 years of experience in Data Engineering, including at least 1-2 years in Generative AI/LLM technologies. Proven experience architecting scalable data platforms and deploying AI models into production.
Background in BFSI, real estate, technology, consulting, or other enterprise sectors preferred.
Strong command of Python, SQL, and PySpark; Scala optional.
Expertise in data pipeline orchestration: Airflow, Prefect, DBT.
Experience with cloud-native data architectures: AWS/GCP/Azure.
Hands-on with LLMs, vector databases, RAG pipelines, embeddings, model fine-tuning.
Familiarity with containerisation and orchestration (Docker, Kubernetes).
Understanding of MLOps tools: MLflow, Vertex AI, SageMaker, Weights & Biases.
Job ID: 136364587