AI/ML Solution Development:
- Lead and develop scalable AI solutions using machine learning (ML), deep learning (DL), and Generative AI techniques.
- Architect large-scale AI systems and integrate models throughout the software development lifecycle (SDLC).
- Develop, implement, and optimize machine learning models, including data pre-processing, feature engineering, model selection, hyperparameter tuning, and training on large datasets.
- Deploy models in production while ensuring scalability, performance, and security.
- Integrate AI/ML models with existing software systems and infrastructure for smooth operation.
Data Pipelines and Analytics:
- Build and optimize data pipelines for ETL (extract, transform, load) from diverse sources.
- Leverage analytics tools to extract insights from customer, operational, and bug datasets.
- Identify trends and patterns to inform automation opportunities and process improvements.
- Support data specialists in enhancing data system functionality and performance.
Technical Leadership and Mentorship:
- Organize and lead code and design reviews, ensuring alignment with project requirements and best practices.
- Mentor junior and mid-level team members, providing feedback and guidance.
- Set design and implementation standards with engineering managers and team leads.
- Deliver strategic presentations and reports to senior stakeholders with actionable insights.
Research and Innovation:
- Stay current on emerging AI/ML technologies, frameworks, and algorithms.
- Experiment with cutting-edge techniques to solve complex problems and enhance existing models.
- Work with Large Language Models, Generative AI, and Conversational AI.
Qualifications:
- 9+ years of experience in a Data Science or AI/ML role.
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or related quantitative field.
- 5+ years building production-grade data pipelines for AI/ML solutions.
- Strong foundation in statistics, linear algebra, calculus, and probability, along with deep knowledge of complex ML algorithms.
- Hands-on coding experience in Python and other relevant programming languages.
- Familiarity with SDLC, Agile principles, and big data ecosystems.
- Experience with cloud and data platforms (AWS, Databricks, Snowflake) and big data tools (Hadoop, Spark, Kafka).
- Knowledge of relational and NoSQL databases (Postgres, Cassandra) and workflow orchestration tools.