
Marktine Technology Solutions Pvt Ltd

AWS Data Engineer (Data Build Tool)

  • Posted a day ago

Job Description

Job Title : DBT Developer

Experience: 5+ Years

Location: Remote

Key Responsibilities

DBT Development & Modeling: Design, develop, test, and deploy production-grade, secure, and performant data models using dbt (Jinja and SQL) in a modular and incremental fashion.
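For candidates unfamiliar with the modular, incremental style referred to here, a minimal sketch of an incremental dbt model follows; the model, table, and column names are illustrative, not taken from the actual project.

```sql
-- models/marts/fct_interactions.sql (illustrative names)
{{
    config(
        materialized='incremental',
        unique_key='interaction_id'
    )
}}

select
    interaction_id,
    hcp_id,
    interaction_date,
    channel
from {{ ref('stg_interactions') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is
  -- already present in the target table.
  where interaction_date > (select max(interaction_date) from {{ this }})
{% endif %}
```

The `is_incremental()` block is what keeps reruns cheap: a full refresh rebuilds the table, while routine runs merge only new rows on `unique_key`.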

Databricks Integration: Utilize dbt to transform data residing in Databricks Unity Catalog and Delta Lake tables, ensuring models are optimized for consumption by Snowflake and other downstream systems.

Performance Optimization: Optimize dbt models and associated Databricks queries (SQL/Spark) for scale, efficiency, and cost, leveraging partitioning, clustering, and table optimization techniques (e.g., Z-Ordering).
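The Z-Ordering mentioned above is a Databricks Delta Lake feature; a sketch of the relevant SQL, with an assumed table and clustering column:

```sql
-- Illustrative Databricks SQL: compact small files and co-locate
-- data on a frequently filtered column (names are assumptions).
OPTIMIZE catalog.schema.fct_interactions
ZORDER BY (hcp_id);

-- Inspect the operation afterwards via the table's transaction history.
DESCRIBE HISTORY catalog.schema.fct_interactions;
```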

Code Quality & Governance: Establish and enforce best practices for dbt development, including documentation, version control (GitLab/Git), code reviews, and comprehensive testing (data quality, uniqueness, referential integrity).
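The testing categories listed here map directly onto dbt's declarative tests. A hedged sketch of a `schema.yml`, with illustrative model and column names:

```yaml
# models/marts/schema.yml (illustrative) — declarative dbt tests
# covering uniqueness, not-null data quality, and referential integrity.
version: 2

models:
  - name: fct_interactions
    columns:
      - name: interaction_id
        tests:
          - unique
          - not_null
      - name: hcp_id
        tests:
          - not_null
          - relationships:        # referential integrity check
              to: ref('dim_hcp')
              field: hcp_id
```

Running `dbt test` (or `dbt build`) executes each of these as a generated SQL query that fails on any violating rows.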

CI/CD Integration: Integrate dbt projects into the CI/CD pipeline (managed via GitLab) for automated testing and deployment across development, staging, and production environments.
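A minimal sketch of what such a GitLab pipeline might look like, assuming the dbt-databricks adapter and `staging`/`prod` targets defined in the project's profiles; all names here are assumptions:

```yaml
# .gitlab-ci.yml (illustrative) — test dbt changes on merge requests,
# deploy from the default branch.
stages:
  - test
  - deploy

dbt_test:
  stage: test
  image: python:3.11-slim
  script:
    - pip install dbt-databricks   # adapter choice is an assumption
    - dbt deps
    - dbt build --target staging   # runs models and tests together
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

dbt_deploy:
  stage: deploy
  image: python:3.11-slim
  script:
    - pip install dbt-databricks
    - dbt deps
    - dbt build --target prod
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
```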

Data Governance: Work closely with Data Governance and Compliance teams to ensure models handle sensitive data (e.g., OneTrust consent data) appropriately.

Required Qualifications

5+ years of professional experience in Data Engineering, Business Intelligence, or Analytics Engineering.

2+ years of hands-on, production experience with dbt (data build tool).

Expert proficiency in SQL and experience with complex data modeling techniques (e.g., dimensional modeling).

Proven experience with a Databricks/Spark environment, specifically reading from Delta Lake tables and using Databricks SQL or standard Spark SQL for transformations.

Demonstrable experience working with data specific to the Pharmaceutical or Life Sciences industry, including familiarity with systems such as Veeva CRM and data domains such as HCP/HCO, Address & Affiliation, and Sales & Interactions data.

Experience with CI/CD practices and tooling (e.g., GitLab, GitHub Actions) for automated code deployment.

Familiarity with Cloud Data Warehouses, preferably Snowflake or similar platforms.

More Info


Job ID: 134561665