
Vertiv

IT Data Analytics Specialist (ETL Developer)

4-7 Years
  • Posted 3 days ago

Job Description

Job Title: Senior Data Engineer

Function/Department: Global Data Analytics

Reports to: Data Analytics Supervisor

Location: Pune

Overview

We are seeking a highly skilled Senior Data Engineer with deep expertise in Snowflake and strong hands-on experience with Matillion or similar ETL/ELT tools. The ideal candidate will design, build, and optimize scalable data pipelines, ensure data quality, and contribute to the modernization of our data architecture.

This role is ideal for someone who thrives in complex, enterprise-level environments, collaborates effectively with cross-functional teams, and is passionate about high-quality, reliable data solutions.

Key Responsibilities

Data Architecture & Engineering

  • Design, build, and maintain scalable ETL/ELT pipelines using Matillion or comparable tools (Informatica, Talend, dbt, ADF, etc.).
  • Develop and optimize Snowflake objects, including warehouses, schemas, tables, views, streams, tasks, and stored procedures.
  • Implement robust data ingestion frameworks for batch and real-time data flows.
  • Optimize Snowflake performance through clustering, caching, micro-partitioning, and resource monitoring.
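By way of illustration, a batch ETL step of the kind described above can be reduced to an extract-transform-load skeleton. The following Python sketch is purely hypothetical (the field names and transformation are invented for the example; in this role, such logic would typically live in Matillion or a comparable tool rather than hand-rolled scripts):

```python
import csv
import io

def transform(row):
    # Hypothetical transformation: normalize the region name and cast amount to float.
    return {"region": row["Region"].strip().upper(), "amount": float(row["Amount"])}

def run_batch(raw_csv):
    # Extract: parse raw CSV text. The load step is stubbed as a returned list;
    # a real pipeline would write to a Snowflake stage or table.
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [transform(row) for row in reader]

rows = run_batch("Region,Amount\n emea ,10.5\napac,3\n")
```

The same extract/transform/load separation is what makes pipelines testable and reusable, whichever orchestration tool hosts them.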

Data Quality & Governance

  • Ensure the integrity, availability, and reliability of data across all systems.
  • Define and automate data validation, profiling, and quality checks.
  • Collaborate with data governance teams to implement standards and policies.
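As an illustration of the automated validation and quality checks mentioned above, here is a minimal Python sketch. The rules and the sample dataset are hypothetical; in practice such checks would run inside the pipeline itself or via a framework such as dbt tests:

```python
def check_not_null(rows, column):
    """Return indices of rows where the given column is missing or empty."""
    return [i for i, row in enumerate(rows) if row.get(column) in (None, "")]

def check_unique(rows, column):
    """Return values that appear more than once in the given column."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        else:
            seen.add(value)
    return sorted(dupes)

# Hypothetical sample data with one missing email and one duplicate id.
data = [
    {"id": "1", "email": "a@x.com"},
    {"id": "2", "email": ""},
    {"id": "2", "email": "b@x.com"},
]
null_violations = check_not_null(data, "email")
dupe_violations = check_unique(data, "id")
```

Checks like these are usually wired into the pipeline so that violations fail the run or raise an alert rather than silently propagating bad data downstream.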

Collaboration & Delivery

  • Work closely with data analysts, data scientists, product teams, and business stakeholders to understand requirements and translate them into technical solutions.
  • Lead code reviews, provide mentorship, and support best practices across the engineering team.
  • Contribute to architectural decisions and participate in strategic data roadmap planning.

Automation & DevOps

  • Build CI/CD pipelines for data workflow deployments.
  • Develop reusable frameworks or templates to increase efficiency of future data engineering builds.
  • Implement monitoring, alerting, and operational excellence practices for data pipelines.

Required Qualifications

  • 4 to 7 years of experience as a Data Engineer or similar role.
  • Strong expertise with Snowflake (SQL, performance tuning, data modeling, warehouse configuration).
  • Hands-on experience with Matillion or other ETL/ELT tools (e.g., Talend, Informatica, dbt, Azure Data Factory).
  • Advanced SQL skills: the ability to write and optimize complex queries.
  • Strong understanding of data modeling, dimensional models, and data warehouse concepts.
  • Experience with cloud platforms such as AWS, Azure, or GCP.
  • Experience with Oracle Business Applications (ERP) is a plus.
  • Proficiency in Python, Airflow, or other orchestration tools.
  • Familiarity with Git, CI/CD, and general DevOps practices.


About Company

Liebert Corporation is a global manufacturer of power, precision cooling, and infrastructure management systems for mainframe computers, server racks, and critical process systems. A subsidiary of Vertiv, it is headquartered in Columbus, Ohio, and employs more than 1,800 people across 12 manufacturing plants worldwide.

Job ID: 144237043