Senior Data Engineer / Analytics Engineer (India-Based)
Hourly Contract | Remote | $35–$70 per hour
Mercor is partnering with a cutting-edge AI research lab to hire a Senior Data / Analytics Engineer with expertise across DBT and the Snowflake Cortex CLI. In this role, you will build and scale Snowflake-native data and ML pipelines, leveraging Cortex's emerging AI/ML capabilities while maintaining production-grade DBT transformations. You will work closely with data engineering, analytics, and ML teams to prototype, operationalise, and optimise AI-driven workflows, defining best practices for Snowflake-native feature engineering and model lifecycle management. This is a high-impact role within a modern, fully cloud-native data stack.
Responsibilities
- Design, build, and maintain DBT models, macros, and tests following modular data modeling and semantic best practices
- Integrate DBT workflows with the Snowflake Cortex CLI, enabling:
  - Feature engineering pipelines
  - Model training and inference tasks
  - Automated pipeline orchestration
  - Monitoring and evaluation of Cortex-driven ML models
- Establish best practices for DBT + Cortex architecture and usage patterns
- Collaborate with data scientists and ML engineers on Cortex workloads
- Build and optimise CI/CD pipelines for DBT (GitHub Actions, GitLab, Azure DevOps)
- Tune Snowflake compute and queries for performance and cost efficiency
- Troubleshoot DBT artifacts, Snowflake objects, lineage, and data quality issues
- Provide guidance on DBT project governance, structure, documentation, and testing
Required Qualifications
- 3+ years of experience with DBT Core or DBT Cloud (macros, packages, testing, deployments)
- Strong expertise with Snowflake (warehouses, tasks, streams, materialized views, performance tuning)
- Hands-on experience with Snowflake Cortex CLI or ability to ramp up quickly
- Strong SQL skills; working familiarity with Python for scripting and automation
- Experience integrating DBT with orchestration tools (Airflow, Dagster, Prefect, etc.)
- Solid understanding of modern ELT patterns and version-controlled analytics development
Nice-to-Have Skills
- Experience operationalising ML workflows inside Snowflake
- Familiarity with Snowpark, Python UDFs/UDTFs
- Experience building semantic layers using DBT metrics
- Knowledge of MLOps / DataOps best practices
- Exposure to LLM workflows, vector search, or unstructured data pipelines
PS: Mercor reviews applications daily. Please complete your interview and onboarding steps to be considered for this opportunity.