
LiveRamp

Senior Data Engineer

  • Posted 17 hours ago

Job Description

You will:

  • Contribute to the design and maintenance of data pipelines that support business and product decision-making.
  • Support the reliability and performance of data workflows across FiveTran, DBT, and Google Cloud (BigQuery).
  • Contribute to team-wide adoption of best practices for ETL and data automation with a focus on high data quality and system resilience.
  • Administer and optimize data platforms, monitoring data systems for efficiency, scalability, and cost-effectiveness.
  • Leverage frameworks and tools that support data pipelines to improve the reliability and observability of our data ecosystem.
  • Partner with cross-functional teams (Data Science, Analytics, Engineering) to ensure that data solutions meet organizational needs.
  • Mentor and onboard other engineers, contributing to shared standards and technical excellence.


Measures of success:

  • Build and support reliable, automated data pipelines with minimal downtime and SLAs consistently met.
  • Demonstrate improvements in data platform efficiency and scalability metrics.
  • Drive adoption of best practices and data standards across teams.
Your team will:

  • Build and maintain LiveRamp's data infrastructure, enabling efficient, secure, and high-quality data flow across systems.
  • Drive collaboration across engineering and analytics teams to support data-driven initiatives.
  • Tackle large-scale data challenges that impact multiple business domains and customer-facing systems.
About you:

  • 5+ years of experience in Data Engineering or a related technical role.
  • Deep expertise in data modeling, ETL/ELT architecture, and SQL.
  • Proficiency in Python for data transformation, automation, and tooling.
  • Hands-on experience with modern data stack tools such as DBT, FiveTran, and Airflow (or similar ETL orchestration systems).
  • Experience with at least one major cloud data platform (e.g., GCP BigQuery, AWS Redshift, Azure Synapse, or Snowflake).
  • Understanding of data reliability, governance, and distributed computing.
  • Experience supporting data platforms (access, cost optimization, monitoring).
  • Excellent communication skills and ability to work cross-functionally with technical and non-technical partners.
  • A self-starter who thrives in fast-paced, evolving environments.
Preferred Skills:

  • Hands-on experience with Google Cloud Platform (BigQuery).
  • Experience with PySpark or distributed data processing frameworks.
  • Experience building custom data frameworks or automation tooling.
  • Exposure to data product management and data observability platforms.


Job ID: 145301705
