
Inviting applications for the role of Senior Principal Consultant - DBT Engineer - Data Platform Architect
Job Summary:
We are seeking a highly skilled DBT Engineer - Data Platform Architect with a strong background in data engineering, modern data platforms, and cloud data warehouses - specifically DBT, with Snowflake and Databricks as target platforms. This role will be instrumental in designing, building, and optimizing our enterprise-scale data platform and analytics pipeline architecture. You will lead the end-to-end implementation of data transformation workflows, enabling high-quality, governed, and scalable data models that serve business intelligence, analytics, and operational reporting needs.
You will collaborate with data engineers, analysts, product managers, and business stakeholders to architect and implement solutions that deliver trusted and actionable data to drive business outcomes.
Key Responsibilities:
Design and architect scalable, reliable, and performant data platforms leveraging Snowflake and DBT.
Define data modeling standards (e.g., star/snowflake schema) and implement dimensional models for analytical use cases.
Establish best practices for managing data workflows using DBT, including modularization, version control, documentation, and testing.
Develop and maintain robust DBT projects (models, macros, tests, sources, snapshots, seeds).
Build curated data models (marts) for analytics and reporting consumption.
Implement DBT testing strategies to ensure data quality and integrity (e.g., uniqueness, non-null, referential integrity checks).
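To illustrate the kind of testing strategy described above, a referential-integrity check can be written as a dbt singular test: a SQL file that returns any rows violating the rule, so the test passes only when the query returns zero rows. The file and model names below (stg_orders, stg_customers) are illustrative, not part of any existing project:

```sql
-- tests/assert_orders_reference_valid_customers.sql
-- dbt singular test: passes when this query returns zero rows.
-- Model names are hypothetical examples.
select o.order_id
from {{ ref('stg_orders') }} as o
left join {{ ref('stg_customers') }} as c
    on o.customer_id = c.customer_id
-- An order with no matching customer violates referential integrity
where c.customer_id is null
```

Generic tests such as uniqueness and non-null checks are typically declared in a schema.yml file alongside the models; dbt compiles both kinds into queries and fails the run when violating rows are found.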
Snowflake Implementation:
Leverage Snowflake's features such as virtual warehouses, zero-copy cloning, time travel, and data sharing.
Optimize Snowflake performance through partitioning, clustering, and query tuning.
Manage access controls, role-based security, and data governance using Snowflake and dbt.
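As a sketch of the role-based access control mentioned above, read access to a curated mart might be granted to an analyst role as follows (all object names - analyst_role, analytics_wh, prod_db.marts - are hypothetical):

```sql
-- Sketch of Snowflake RBAC: give an analyst role read-only access to a mart schema.
-- Object names are illustrative only.
create role if not exists analyst_role;

grant usage on warehouse analytics_wh to role analyst_role;
grant usage on database prod_db to role analyst_role;
grant usage on schema prod_db.marts to role analyst_role;
grant select on all tables in schema prod_db.marts to role analyst_role;

-- Also cover tables created after this grant:
grant select on future tables in schema prod_db.marts to role analyst_role;

grant role analyst_role to user some_analyst;
```

Pairing grants like these with dbt's generated documentation and model-level access configuration keeps governance visible in one place.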
Data Pipeline Orchestration:
Collaborate with data engineers to integrate DBT models into orchestration tools (e.g., Airflow, dbt Cloud, Dagster).
Ensure data pipelines are resilient, observable, and meet SLAs.
Work with cross-functional teams to understand data requirements and deliver data solutions.
Provide mentorship and guidance to junior engineers and analysts on DBT and Snowflake best practices.
Conduct code reviews and ensure adherence to CI/CD practices for data infrastructure.
Maintain comprehensive documentation of models, data flows, and architecture.
Support compliance with data privacy and governance policies (e.g., GDPR, HIPAA).
Qualifications we seek in you!
Minimum Qualifications
Experience in data engineering, architecture, and platform design.
Expertise in DBT for data transformation and modeling.
Strong knowledge of Snowflake and Databricks, including performance tuning and optimization.
Hands-on experience with any cloud platform (AWS, Azure, GCP).
Solid understanding of SQL, Python, or Scala for data engineering.
Experience with data pipeline orchestration tools (Airflow, Prefect, dbt Cloud).
Knowledge of data warehousing principles and modern data architectures.
Strong analytical and problem-solving skills.
Excellent communication and collaboration skills.
Preferred Qualifications:
Experience in real-time data processing and streaming technologies (Kafka, Spark Streaming).
Familiarity with CI/CD pipelines for data engineering.
Exposure to machine learning and AI-driven analytics.
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Job ID: 125786953