
AI Engineer

  • Posted 9 hours ago

Job Description

About Arohi

Founded in 2006, Arohi is an asset management company managing funds for leading global institutions, with offices in Singapore and India.

 

Role Overview

Build and maintain the data infrastructure and agent systems that underpin fundamental equity investment research - supporting the investment process from initial idea generation through deep-dive research to portfolio management. You will join a new team, and the role is hands-on across pipelines, agent workflows, data systems, MCP tools, and evaluation infrastructure.

 

Key Responsibilities 

  • Data Pipeline Development: Build and maintain pipelines that ingest, clean, and structure financial data from multiple sources - market data, filings, earnings transcripts, macro feeds - handling the inconsistencies, latency differences, and structural complexity that come with real-world financial data at scale. 
  • RAG & Knowledge Graph Development: Build and maintain retrieval systems and knowledge graphs that give agents accurate, structured access to financial data. Continuously iterate on retrieval quality - improving how data is chunked, indexed, and surfaced - based on agent performance and input from investment analysts. 
  • Agent System Development: Build and implement multi-agent systems capable of carrying out complex, multi-step research workflows - coordinating specialist agents across retrieval, reasoning, and synthesis, and managing state and context across long-running runs. Work within the broader system architecture to ensure agents access tools and data correctly, and that outputs are structured for downstream use. 
  • Evaluation & Iteration: Instrument agent runs to capture structured output that supports systematic review and failure analysis. Identify failure patterns - retrieval errors, reasoning gaps, data quality issues - and iterate continuously. Work closely with investment analysts to understand where outputs are falling short, and incorporate their feedback into retrieval systems, tooling, and agent behaviour on an ongoing basis. 
  • Frontend & API Integration: Connect agent systems and data infrastructure to front-end applications via clean, reliable APIs. Wrap data sources, financial models, and external services into versioned tools that agents and downstream systems can depend on. 
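To illustrate the last responsibility - wrapping data sources into versioned tools that agents and downstream systems can depend on - here is a minimal sketch in Python. All names here (`VersionedTool`, `ToolRegistry`, `chunk_transcript`) are illustrative assumptions, not part of Arohi's actual stack:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


def chunk_transcript(text: str, max_words: int = 50) -> List[str]:
    """Split an earnings transcript into word-bounded chunks for indexing.

    A real pipeline would chunk on speaker turns or semantic boundaries;
    fixed word windows keep the sketch self-contained.
    """
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]


@dataclass(frozen=True)
class VersionedTool:
    """A callable published under an explicit (name, version) contract."""
    name: str
    version: str
    fn: Callable


class ToolRegistry:
    """Registry that agents resolve tools from by name and pinned version,
    so a tool's behaviour can change without silently breaking callers."""

    def __init__(self) -> None:
        self._tools: Dict[Tuple[str, str], VersionedTool] = {}

    def register(self, tool: VersionedTool) -> None:
        self._tools[(tool.name, tool.version)] = tool

    def get(self, name: str, version: str) -> VersionedTool:
        return self._tools[(name, version)]


# An agent or API handler resolves the pinned tool version it was built against.
registry = ToolRegistry()
registry.register(VersionedTool("chunk_transcript", "v1", chunk_transcript))

tool = registry.get("chunk_transcript", "v1")
chunks = tool.fn("revenue " * 120, max_words=50)  # 120 words -> 3 chunks
```

The version pin is the point of the pattern: when chunking logic changes, a `"v2"` is registered alongside `"v1"`, and downstream agents migrate deliberately rather than breaking on deploy.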

 

What We're Looking For 

Someone with a genuine curiosity about how businesses work and how capital is allocated - and the drive to build systems that fundamentally change the depth and scale at which investment research can be done. We value candidates who have built LLM-based solutions as a core part of their practice and who are excited by the hard problem of making rigorous, large-scale investment research possible through AI. 

 

Requirements 

  • Experience: 3-6 years in data engineering, ML infrastructure, or backend systems, with recent hands-on experience building LLM agent systems in the context of financial or deep research applications. Experience working with equity research data - filings, earnings transcripts, pricing feeds, financial statements, and consensus estimates - is a strong plus. 
  • Education: Master's or PhD in Computer Science, Engineering, or a related quantitative field preferred. Strong portfolios of shipped projects or open-source contributions are equally welcome. 
  • Agent Systems: Hands-on experience with agentic frameworks (LangGraph, Google ADK, or equivalent). Practical understanding of prompt chaining, tool use, memory, and multi-agent orchestration. 
  • RAG & Knowledge Graphs: Experience building and iterating on retrieval-augmented generation systems and knowledge graphs. Able to diagnose how retrieval quality affects agent output. 
  • Evaluation: Experience building evaluation infrastructure for LLM systems - structured failure analysis and tight iteration cycles with domain experts. 
  • Engineering: Strong in Python. Comfortable with SQL, ETL pipelines and building APIs. Familiarity with AWS, Azure, or GCP is a plus. 

More Info


Job ID: 147254511
