
Brillio

Senior Data Engineer - R01561387

  • Posted 5 hours ago

Job Description

Senior Data Engineer

Primary Skills

  • DataStream, ETL Fundamentals, SQL (Basic + Advanced), Python, Data Warehousing, Time Travel and Fail Safe, Snowpipe, SnowSQL, Modern Data Platform Fundamentals, PLSQL, T-SQL, Stored Procedures

Specialization

  • Snowflake Engineering: Data Engineer

Job requirements

Job Summary

We are seeking a Senior Data Engineer with strong expertise in DBT, Python, and SQL to design, develop, and optimize scalable data pipelines and ETL processes. The ideal candidate will have hands-on experience working with modern data stacks, particularly Snowflake, and will play a key role in building reliable analytics and reporting solutions on cloud platforms.

This role demands strong problem-solving skills, a deep understanding of data warehousing concepts, and the ability to collaborate closely with business stakeholders to translate requirements into robust technical solutions.

Key Responsibilities

  • Design, develop, and maintain DBT models, transformations, tests, and optimized SQL code for analytics and reporting use cases.
  • Build and enhance scalable ETL/ELT pipelines using DBT and complementary data engineering tools.
  • Write complex, high-performance SQL queries to process, analyze, and transform large datasets.
  • Develop clean, efficient, and scalable code in Python, with strong hands-on experience using Pandas and NumPy.
  • Optimize data pipelines for performance, scalability, and cost efficiency, especially in cloud-based data warehouses.
  • Troubleshoot and resolve ETL failures, performance bottlenecks, and data quality issues.
  • Develop and maintain scripts using Unix Shell scripting, Python, and related tools for data extraction, transformation, and loading.
  • Write, tune, and support Snowflake SQL and contribute to Snowflake implementation and optimization initiatives.
  • Work with workflow orchestration tools such as Apache Airflow (or similar platforms) to manage and monitor data pipelines.
  • Integrate data solutions with user-facing applications where required.
  • Collaborate with business, analytics, and technical stakeholders to gather requirements and deliver reliable, interpretable data solutions.
  • Implement data validation, testing, and quality checks within DBT workflows to ensure accuracy and reliability.
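For context, the data validation and quality checks mentioned above usually mirror DBT's built-in tests (not_null, unique, accepted ranges). A minimal sketch of the same idea in Python with Pandas — the table and column names (`order_id`, `amount`) are illustrative, not from the posting:

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality failures for an orders table.

    The checks mirror common DBT tests: not_null and unique on the
    key column, and a simple accepted-range rule on amounts.
    """
    failures = []
    if df["order_id"].isna().any():
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures

# Hypothetical sample data with two deliberate quality problems.
orders = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "amount": [10.0, -5.0, 7.5, 3.2],
})
print(validate_orders(orders))
```

In a DBT workflow the equivalent rules would live in a schema YAML file as column tests rather than imperative code, but the intent is the same: fail the pipeline early when the data breaks its contract.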

Required Skills & Qualifications

  • 6+ years of overall IT experience, with strong focus on data engineering.
  • Proven hands-on experience with DBT (Data Build Tool) including model development, transformations, and testing.
  • Strong programming expertise in Python (mandatory).
  • Advanced proficiency in SQL, including writing and optimizing complex queries on large datasets.
  • Hands-on experience designing, building, and maintaining ETL/ELT pipelines.
  • Experience with Unix/Linux shell scripting.
  • Solid understanding of data warehousing concepts, dimensional modeling, and best practices.
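"Advanced SQL" in roles like this typically means window functions, CTEs, and query tuning on large tables. A small self-contained illustration of a window function, run here through Python's built-in sqlite3 purely so it is executable — the `orders` table and its rows are made up for the example:

```python
import sqlite3

# Rank each customer's orders by amount using a window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 20);
""")
rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
print(rows)
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` pattern carries over directly to Snowflake SQL, where it is a routine tool for deduplication and top-N-per-group queries.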

Preferred / Good-to-Have Skills

  • Experience with Snowflake implementation, performance tuning, and cost optimization.
  • Knowledge or hands-on exposure to Salesforce CDP.
  • Experience using Airflow or other orchestration frameworks.
  • Exposure to cloud platforms such as AWS, GCP, or Azure.
  • Experience working with cloud storage solutions like Amazon S3, Google Cloud Storage (GCS), or Azure Blob Storage.

Ideal Candidate Profile

  • Strong analytical and problem-solving skills
  • Comfortable working in fast-paced, data-driven environments
  • Ability to own data pipelines end-to-end
  • Strong collaboration and communication skills

Job ID: 146109363
