Experience: 3+ years
Salary: INR 2500000-4500000 / year (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients - Nuaav)
What do you need for this opportunity?
Must have skills required:
Snowflake, Snowflake SQL, Snowflake stored procedures, Python, Sql (advanced), SQL Server, SQL Refactoring, Migration, Cloud data pipelines
Nuaav is Looking for:
- Location: Noida / Remote / Hybrid
- Experience: 3-10 Years
- Employment Type: Full-time
Company Overview
Nuaav is a boutique technology consulting firm specializing in scalable data engineering, cloud modernization, and AI-driven transformation. We partner with enterprises to build modern data platforms, streamline migrations, and deliver high-quality engineering solutions with agility and precision.
Role Summary
- We are looking for an experienced Data Engineer with strong expertise in Snowflake and/or Greenplum, and hands-on experience in data pipeline development, SQL optimization, and cloud migration projects.
- You will work on end-to-end data modernization initiatives, including Greenplum-to-Snowflake migrations, ETL/ELT development, performance tuning, and automated data-processing workflows.
- This is a high-impact role where you will collaborate with architects, product teams, and clients to build secure, scalable, and efficient data ecosystems.
Key Responsibilities
Snowflake & Greenplum Engineering
- Design, develop, and optimize Snowflake objects: tables, views, stored procedures, tasks, streams, Snowpipe, and COPY pipelines.
- Migrate existing logic from Greenplum / Hadoop / SQL Server / Oracle into Snowflake stored procedures and scripts.
- Support end-to-end migration cycles: schema migration, code refactoring, unit testing, data validation, performance tuning.
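To give a flavour of the code-refactoring work above, here is a minimal Python sketch of one mechanical step in a Greenplum-to-Snowflake migration: stripping Greenplum-only DDL clauses and mapping a couple of common types. The clause list and type mappings are illustrative only; a real migration needs a proper SQL parser and per-object review.

```python
import re

# Illustrative (not exhaustive) Greenplum-only clauses to strip,
# and a tiny sample type mapping - both are assumptions for this sketch.
GP_ONLY_CLAUSES = re.compile(
    r"\s*(DISTRIBUTED\s+(BY\s*\([^)]*\)|RANDOMLY)|WITH\s*\(appendonly=true[^)]*\))",
    re.IGNORECASE,
)
TYPE_MAP = {"serial": "INTEGER AUTOINCREMENT", "text": "VARCHAR"}

def refactor_ddl(gp_ddl: str) -> str:
    """Return a Snowflake-leaning version of a Greenplum CREATE TABLE."""
    ddl = GP_ONLY_CLAUSES.sub("", gp_ddl)
    for gp_type, sf_type in TYPE_MAP.items():
        ddl = re.sub(rf"\b{gp_type}\b", sf_type, ddl, flags=re.IGNORECASE)
    return ddl.strip()

ddl = "CREATE TABLE sales (id serial, note text) DISTRIBUTED BY (id)"
print(refactor_ddl(ddl))
# -> CREATE TABLE sales (id INTEGER AUTOINCREMENT, note VARCHAR)
```

In practice this kind of rewrite is followed by the unit testing and data validation steps listed above, since regex-level refactoring cannot catch semantic differences between the engines.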
Data Pipelines & ETL/ELT
- Build and maintain ETL/ELT workflows using tools such as SSIS, Matillion, Talend, AWS Glue, or custom Python/SQL-based pipelines.
- Work on ingestion frameworks that extract data from relational, NoSQL, and cloud sources into Snowflake, Redshift, or S3.
- Implement automation routines, pipeline orchestration, and job monitoring using Control-M, Lambda, or Azure/AWS DevOps.
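At its simplest, the orchestration and job-monitoring work above boils down to running steps in dependency order with retries and logging. The sketch below is a generic illustration (function and step names are made up for this example); in production, Control-M, Lambda, or a DevOps pipeline would own scheduling and alerting.

```python
import logging
import time
from typing import Callable, List, Tuple

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retry(step: Callable[[], None], name: str,
                   retries: int = 3, backoff_s: float = 0.1) -> bool:
    """Run one pipeline step, retrying on failure; return success."""
    for attempt in range(1, retries + 1):
        try:
            step()
            log.info("step %s succeeded (attempt %d)", name, attempt)
            return True
        except Exception as exc:
            log.warning("step %s failed (attempt %d): %s", name, attempt, exc)
            time.sleep(backoff_s * attempt)
    return False

def run_pipeline(steps: List[Tuple[str, Callable[[], None]]]) -> bool:
    """Run steps in order; abort the run on the first hard failure."""
    for name, step in steps:
        if not run_with_retry(step, name):
            log.error("pipeline aborted at step %s", name)
            return False
    return True
```

A real orchestrator adds scheduling, dependency graphs, and alerting on top of this retry-and-log core; the point here is only the control flow.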
Data Optimization & Engineering Best Practices
- Tune and optimize complex SQL queries using Snowflake Query Profile, CTEs, dynamic SQL, clustering, partitioning, and caching techniques.
- Conduct data validation between legacy (Greenplum/Hadoop/Oracle) and target Snowflake environments.
- Ensure data quality, performance, and reliability across all pipelines.
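One common shape for the legacy-vs-Snowflake validation described above is comparing row counts plus an order-insensitive content fingerprint per table. The pure-Python sketch below illustrates the idea; in practice you would compute the equivalent aggregates inside each database (e.g. COUNT(*) plus a hash aggregate) and compare only the resulting numbers, never pull full tables to the client.

```python
import hashlib
from typing import Iterable, Tuple

def table_fingerprint(rows: Iterable[tuple]) -> Tuple[int, str]:
    """Order-insensitive fingerprint of a row set: (row_count, digest)."""
    count = 0
    acc = 0
    for row in rows:
        count += 1
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")  # XOR makes order irrelevant
    return count, format(acc, "016x")

def validate(source_rows: Iterable[tuple], target_rows: Iterable[tuple]) -> bool:
    """True when source and target hold the same rows (in any order)."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)
```

XOR-of-hashes is cheap and order-insensitive but cannot distinguish certain duplicate patterns, so production validation usually adds per-column checksums or sampled row-level diffs on top.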
Collaboration & Delivery
Work closely with architects, analysts, and stakeholders to understand data requirements and translate them into scalable solutions.
Contribute to cloud modernization initiatives, POCs, and performance benchmarking.
Support DevOps practices using Git-based version control, CI/CD, automated deployments, and environment migration.
Required Skills & Experience
Core Technical Expertise
Strong hands-on experience with Snowflake (must have) including:
Stored procedures, UDFs, tasks, streams
Time Travel, Cloning, Fail-safe
Snowpipe, COPY INTO, Staging & Data Loading
Query optimization & profiling
Experience with Greenplum, PostgreSQL, or equivalent MPP systems.
Strong SQL development skills and ability to troubleshoot and optimize large-scale queries.
Experience with ETL tools: SSIS, Matillion, Talend, AWS Glue, Informatica, or similar.
Cloud experience (AWS/Azure) with S3 buckets, Lambda, Redshift, or Azure equivalents.
Preferred / Good To Have
Python scripting for data processing and validation.
Experience migrating SQL logic between heterogeneous systems (Oracle to Snowflake, Greenplum to Snowflake, SQL Server to Redshift).
Knowledge of microservices or API integrations.
Familiarity with CI/CD, DevOps pipelines, Git/Bitbucket/GitHub.
Exposure to reporting tools (Power BI, Tableau).
Soft Skills
Strong problem-solving, analytical, and debugging skills.
Ability to collaborate in agile, multi-stakeholder environments.
Excellent communication and documentation capabilities.
Education
B.E./B.Tech/M.Tech/MCA in Computer Science, Information Technology, or a related field.
Why Work With Nuaav
- Work on large-scale Snowflake migration projects and modern cloud data platforms
- High ownership and meaningful work: your decisions matter
- Opportunity to learn across data engineering, cloud, and AI initiatives
- Fast-paced but supportive consulting environment
- Direct access to leadership and architects
- Flexible work model (Noida office + hybrid/remote options)
How to apply for this opportunity
- Step 1: Click On Apply! And Register or Login on our portal.
- Step 2: Complete the Screening Form & Upload updated Resume
- Step 3: Increase your chances to get shortlisted & meet the client for the Interview!
About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant opportunities and progress in their career. We will support you through any grievances or challenges you may face during the engagement.
(Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well).
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!