Key Responsibilities:
- 4-7 years of relevant experience
- Design and implement end-to-end generative AI solutions, including RAG (Retrieval-Augmented Generation) chatbots, LLM-powered BI systems, and coding agents
- Develop and deploy AI agents using frameworks like LangGraph and similar orchestration tools
- Build robust document ingestion and retrieval pipelines using LlamaIndex and related libraries
- Implement and optimize vector databases for semantic search and retrieval systems
- Integrate multiple LLM providers (OpenAI, Gemini, Anthropic) and evaluate model performance
- Set up comprehensive observability and monitoring using tools like LangSmith
- Collaborate with engineering teams to productionize ML models and AI applications
Required Skills:
Generative AI (Minimum Requirements):
- Hands-on experience with LangGraph or similar AI agent frameworks
- Proficiency with LlamaIndex, LangChain, or equivalent data processing libraries
- Experience with vector databases (Pinecone, Weaviate, Chroma, etc.)
- Working knowledge of multiple LLMs and their APIs (OpenAI GPT, Gemini, Claude)
- Experience with LLM observability tools (LangSmith, Weights & Biases, etc.)
- Proven track record of building LLM-powered BI solutions (natural language to SQL)
- Experience developing RAG chatbots
- Experience with coding agents and code generation systems
Traditional AI/ML:
- Strong foundation in clustering, regression, classification, and forecasting
- Proficiency with scikit-learn, PyTorch, TensorFlow
- Experience with statistical analysis and experimental design
- Knowledge of feature engineering and data preprocessing techniques
Good to Have:
- Fine-tuning experience with LLMs, rerankers, or embedding models
- Experience self-hosting and deploying open-source LLMs
- Experience with BERT, transformer architectures, or computer vision models (YOLO)
- MLOps experience with MLflow, Weights & Biases, or TensorBoard
- Cloud platform certifications (AWS, GCP, Azure)
Additional Requirements:
- Strong programming skills in Python
- Experience with containerization (Docker, Kubernetes)
- Knowledge of API development and microservices architecture
- Understanding of prompt engineering and prompt optimization techniques
- Experience with evaluation frameworks for LLM applications
- Familiarity with data privacy and security best practices for AI applications