
TEKsystems

Senior Data Engineer

  • Posted 9 hours ago

Job Description

WHAT YOU'LL DO:

  • Design data pipelines using ETL tools, event-driven software, and other streaming software.
  • Design data architecture that is simple and maintainable while enabling Data Analysts, Data Scientists, and stakeholders to efficiently work with data.
  • Partner with both data scientists and engineers to bring our amazing concepts to reality. This requires learning to speak the language of statisticians as well as software engineers.
  • Ensure reliability and quality in data pipelines and enforce data governance, security, and protection of our customer information while balancing tech debt.
  • Partner with product and engineering teams to design data models for downstream consumers.
  • Evaluate and champion new engineering tools that help us move faster and scale our team.

WHAT YOU'LL NEED:

  • You have 5 years of data engineering experience coding in Python and using SQL.
  • Experience working with at least one major data warehouse such as BigQuery, Snowflake, or Databricks.
  • Bachelor's degree in technology, data science, or a related field; relevant work experience can substitute for education.
  • You are comfortable with data engineering tools such as Airflow, dbt, Airbyte/Fivetran, the Atlassian suite, GitHub, Terraform, and Kubernetes.
  • You understand standard ETL patterns, modern data warehousing concepts such as data mesh and data vaults, and data quality practices such as test-driven design and data observability.
  • You have experience partnering with data analysts, data scientists, and engineering stakeholders.
  • You have a background in data engineering, computer science, software development or another engineering field
  • You can communicate well within a team and articulate ideas to both team members and non-technical stakeholders.
  • You have a drive to learn and master new technologies and techniques

Technologies You Will Use

  • Python/Airflow for data pipelines and automation
  • Airbyte for ETL ingestion
  • dbt for data transformation
  • Snowflake/BigQuery as the data warehouse
  • AWS, GCP, Terraform, Kubernetes, GitHub and more: we keep adopting new tools as we grow!
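To give a flavor of the extract-transform-load work this stack supports, here is a minimal stdlib-only sketch of an ETL step with a data-quality gate of the kind the requirements mention. It is purely illustrative, not this team's actual code: an in-memory SQLite table stands in for a warehouse like Snowflake or BigQuery, and the table, column, and function names are all hypothetical.

```python
import sqlite3

def run_pipeline(rows):
    """Hypothetical ETL step: load raw (name, dollar_amount) rows into a
    SQLite table after a simple transform and a basic data-quality check."""
    conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse
    conn.execute("CREATE TABLE orders (name TEXT, amount_cents INTEGER)")

    # Transform: normalize names and convert dollar amounts to integer cents.
    cleaned = [(name.strip().lower(), round(amount * 100)) for name, amount in rows]

    # Data-quality gate: fail the whole batch if any amount is negative,
    # rather than silently loading bad records downstream.
    if any(cents < 0 for _, cents in cleaned):
        raise ValueError("data-quality check failed: negative amount")

    conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
    conn.commit()
    return conn.execute("SELECT COUNT(*), SUM(amount_cents) FROM orders").fetchone()
```

In a production pipeline, a scheduler such as Airflow would orchestrate steps like this, and dbt tests would typically express the quality checks declaratively instead of inline.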



Job ID: 145599549