

Principal Data Engineer

8-13 Years
  • Posted a month ago
  • Over 50 applicants

Job Description

Architecture & Strategy

  • Own a significant portion of the data platform architecture, ensuring scalability, performance, reliability, and security.
  • Define technical standards and best practices for data modeling, transformation, orchestration, governance, and lifecycle management.
  • Evaluate and integrate modern data technologies that align with the long-term platform strategy.
  • Collaborate with engineering and product leadership to shape the technical roadmap.

Engineering & Delivery

  • Design, build, and manage scalable, resilient data pipelines for batch, streaming, and event-driven workloads.
  • Develop high-quality data models and schemas to support analytics, BI, operational systems, and ML workflows.
  • Implement data quality, lineage, observability, and automated testing frameworks.
  • Build ingestion patterns for APIs, event streams, files, and third-party data sources.
  • Optimize compute, storage, and transformation layers for performance and cost efficiency.

Leadership & Mentorship

  • Serve as a senior technical leader and mentor within the data engineering team.
  • Lead architecture reviews, design discussions, and cross-team engineering initiatives.
  • Guide analysts, data scientists, software engineers, and product owners in delivering robust data solutions.
  • Communicate architectural decisions and trade-offs to both technical and non-technical stakeholders.

Required Qualifications & Skills:

  • 8+ years of experience in Data Engineering with proven architectural ownership.
  • Expert-level experience with Snowflake, including performance optimization, data modeling, security, and ecosystem components.
  • Expert proficiency in SQL and strong Python skills for pipeline development and automation.
  • Experience with modern orchestration tools such as Airflow, Dagster, Prefect, or equivalents.
  • Strong understanding of ELT/ETL patterns, distributed processing, and data lifecycle management.
  • Familiarity with streaming/event technologies (Kafka, Kinesis, Pub/Sub).
  • Experience implementing data quality, observability, and lineage solutions.
  • Solid understanding of cloud infrastructure (AWS, GCP, or Azure).
  • Strong background in DataOps practices: CI/CD, testing, version control, automation.
  • Proven leadership in driving architectural direction and mentoring engineering teams.

Nice to Have:

  • Experience with data governance or metadata management tools.
  • Hands-on experience with dbt (modeling, testing, documentation, advanced features).
  • Exposure to machine learning pipelines, feature stores, or MLOps.
  • Experience with Terraform, CloudFormation, or other IaC tools.
  • Experience designing systems for high scale, security, or regulated environments.

More Info

Open to candidates from: India

About Company

The Hewlett-Packard Company, commonly shortened to Hewlett-Packard or HP, was an American multinational information technology company headquartered in Palo Alto, California. HP developed and provided a wide variety of hardware components, as well as software and related services to consumers, small and medium-sized businesses (SMBs), and large enterprises, including customers in the government, health, and education sectors. The company was founded in a one-car garage in Palo Alto by Bill Hewlett and David Packard in 1939, and initially produced a line of electronic test and measurement equipment. The HP Garage at 367 Addison Avenue is now designated an official California Historical Landmark, and is marked with a plaque calling it the "Birthplace of 'Silicon Valley'".

Job ID: 139864417
