  • Posted 15 hours ago
  • Be among the first 10 applicants
Job Description

We are seeking a highly motivated and skilled AI/ML Engineer to contribute to cutting-edge research and development within our team. This role offers a unique opportunity to work on challenging projects, collaborate with business stakeholders, and gain practical experience applying AI/ML techniques to real-world problems. The successful candidate will be involved in all stages of the machine learning lifecycle, from data preprocessing and feature engineering to model training, evaluation, and deployment. A strong theoretical foundation and hands-on coding experience are essential.

Responsibilities

  • Operationalize machine learning models by building data infrastructure and managing structured and unstructured data, supporting AI/ML/LLM workflows, including data labeling, classification, and document parsing.
  • Collaborate with data scientists, data engineers, and other stakeholders to understand data needs and deliver solutions aligned with business objectives, security, and data governance.
  • Utilize managed ML services such as Vertex AI.
  • Automate infrastructure and deployments using Infrastructure as Code (IaC) tools such as Terraform.
  • Monitor and troubleshoot data pipelines and systems to identify and resolve issues related to performance, reliability, and cost-effectiveness.
  • Document data processes, pipeline designs, and architecture, contributing to knowledge transfer and system maintenance.

Qualifications

  • Bachelor's degree in Computer Science, Computer Engineering, or a similar technical discipline.
  • 2+ years of work experience as a backend cloud software engineer, with familiarity with at least one major cloud platform (preferably GCP) and with Python.
  • Advanced working knowledge of object-oriented and functional programming in Python.
  • Advanced query skills in SQL.
  • Experience with, or a solid understanding of, MLOps, generative AI, AI agents, and chatbots.
  • Experience with ML workflow orchestration tools: Airflow, Kubeflow, etc.
  • Experience with DevOps and CI/CD principles and tools: Jenkins, Tekton, Cloud Build, GitHub Actions, etc.
  • Experience with scripting languages: Bash, PowerShell, etc.
  • Experience with cloud services, preferably GCP services such as Vertex AI, Cloud Functions, Cloud Run, and BigQuery.
  • Experience with container management solutions: Kubernetes, Docker.
  • GCP Expertise: Strong proficiency in GCP services, including BigQuery, Dataflow, Dataproc, Data Fusion, Airflow, Pub/Sub, Cloud Storage, Vertex AI, Cloud Functions, and Cloud Composer, plus GCP-based big data deployments (batch/real-time) leveraging BigQuery and Bigtable.
  • Programming & Scripting: Expert-level skills in Python and SQL are essential.
  • DevOps & MLOps: Knowledge of DevOps methodologies, CI/CD pipelines, and MLOps practices, including integrating data pipelines with ML workflows.
  • Data Engineering Fundamentals: Solid understanding of data modeling, data warehousing concepts, ETL/ELT processes, big data architecture, and designing pipelines and architectures for data processing.
  • Communication & Collaboration: Excellent communication and teamwork skills, with the ability to collaborate effectively with technical and non-technical stakeholders in agile environments.

Job ID: 145341855
