
MetLife

Data Engineer

4-8 Years
  • Posted a month ago

Job Description

Note: This job role is part of MetLife's Hack4Job India (a hiring hackathon). Only shortlisted candidates will be invited.

Department: Global Technology

Role Overview

MetLife is seeking an experienced Data Engineer to drive our digital and AI transformation journey. This role focuses on building modern data platforms, enhancing data storage and access, and ensuring seamless data consumption through APIs. The ideal candidate will work with Azure Cloud technologies to build robust data pipelines, data lakes, and marts to support business analysts and data scientists.

Key Responsibilities

  • Modern Data Platform Development:
      • Build data lake components on cloud-based platforms
      • Design and develop data marts for business analysts and data scientists
  • Data Engineering & Pipelines:
      • Design data pipelines to integrate structured, semi-structured, and unstructured data from multiple sources
      • Implement ETL/ELT processes to transform and cleanse data
      • Ensure data quality and transformation rules align with Enterprise standards
      • Work with Medallion architecture and implement best practices for data modeling
  • Agile & DevOps Practices:
      • Deliver solutions using Agile methodologies in a CI/CD-driven environment
      • Work on containerized solutions (Azure Kubernetes) and scheduling tools like Azure Scheduler
      • Follow secure coding practices and authentication/authorization protocols
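To illustrate the kind of ETL/ELT cleansing work described above, here is a minimal sketch of a bronze-to-silver step in a Medallion-style pipeline. All names (`cleanse_records`, `BRONZE`, the record fields) are hypothetical examples, not part of this role's actual codebase; in practice this logic would typically run on Spark (e.g. Databricks or Synapse Spark Pool) against data in ADLS rather than on plain Python lists.

```python
from datetime import date

# "Bronze" layer: raw records as ingested, possibly dirty or incomplete.
# Sample data invented for illustration only.
BRONZE = [
    {"policy_id": " P-001 ", "premium": "1200.50", "issued": "2023-04-01"},
    {"policy_id": "P-002", "premium": None, "issued": "2023-05-12"},       # missing premium
    {"policy_id": "P-001 ", "premium": "1200.50", "issued": "2023-04-01"}, # duplicate
]

def cleanse_records(rows):
    """Trim keys, cast types, drop rows failing quality rules, and dedupe
    on the business key -- producing a 'silver' layer."""
    seen, silver = set(), []
    for r in rows:
        pid = (r.get("policy_id") or "").strip()
        if not pid or r.get("premium") is None:
            continue  # quality rule: required fields must be present
        if pid in seen:
            continue  # dedupe on business key
        seen.add(pid)
        silver.append({
            "policy_id": pid,
            "premium": float(r["premium"]),
            "issued": date.fromisoformat(r["issued"]),
        })
    return silver

silver = cleanse_records(BRONZE)
print(len(silver))  # one clean, deduplicated record survives
```

The same pattern (validate, cast, dedupe) is what the enterprise data-quality and transformation rules mentioned above would formalize.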

Candidate Qualifications

  • Education: Bachelor's degree in computer science or equivalent
  • Experience: 4 - 8 years of experience in data engineering or data application development (ETL/ELT/BI)
  • 2+ years of experience in cloud-based data platform development
  • Expertise in building Azure-based data pipelines, including:
      • Azure Data Factory / Synapse
      • Databricks / Synapse Spark Pool
      • Cosmos DB
      • Azure Data Lake Storage (ADLS)
      • Dedicated SQL Pool / Azure SQL
      • Azure Logic Apps
  • Hands-on experience with data transformation and cleansing using Spark, Python, R, SQL
  • Strong understanding of CI/CD, test-driven development, and domain-driven design

Skills & Competencies

Technical Expertise:

  • Proficiency in Python, SQL, Spark, Azure Data Factory, and ETL processes
  • Experience in secure coding, authentication, and monitoring tools like Veracode, MS Entra, PingOne
  • Working knowledge of Azure Kubernetes, Azure DevOps, SonarQube, and Azure AppInsights

Soft Skills:

  • Strong communication and collaboration in a global, multicultural environment (experience in a Japanese work environment is a plus)
  • Able to work in a fast-paced, diverse environment with a can-do attitude

Language: Business proficiency in English; Japanese language is a plus

This is a great opportunity to be part of MetLife's technology transformation journey.


Job ID: 141197997
