Company Overview
VAYUZ Technologies is a leading provider of cutting-edge data solutions, specializing in helping businesses unlock the power of their data through advanced analytics and cloud-based technologies.
We operate across diverse industries, empowering organizations to make data-driven decisions and gain a competitive edge. Our expertise lies in building robust data pipelines, implementing sophisticated data models, and delivering actionable insights.
Role Overview
As a DBT Data Engineer at VAYUZ Technologies, you will be instrumental in designing, developing, and maintaining our data transformation pipelines using Data Build Tool (DBT) within an Azure cloud environment.
You will collaborate closely with data scientists, data analysts, and other engineers to ensure the delivery of high-quality, reliable data for critical business initiatives. Your work will directly impact the efficiency and accuracy of our data-driven decision-making processes, enabling us to better serve our clients and drive business growth.
Key Responsibilities
- Develop and maintain data transformation models using DBT, ensuring data quality and consistency for downstream analytics.
- Design and implement efficient data pipelines within the Azure cloud environment, optimizing for performance and scalability.
- Collaborate with data scientists and analysts to understand data requirements and translate them into robust DBT models.
- Automate data pipeline processes using Control-M and Shell Scripting, reducing manual effort and improving operational efficiency.
- Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to data availability.
- Contribute to the development of data governance standards and best practices, promoting data quality and compliance.
- Participate in code reviews and contribute to the continuous improvement of our data engineering practices.
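For candidates unfamiliar with DBT, the transformation models referred to above are SQL files with Jinja templating that DBT compiles and runs against the warehouse. A minimal sketch of an incremental model (the model, source, and column names here are purely illustrative, not part of any actual VAYUZ project):

```sql
-- models/staging/stg_orders.sql (hypothetical model and source names)
-- Incremental materialization avoids full rebuilds on large tables.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    cast(order_date as date) as order_date,
    total_amount
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
-- On incremental runs, only pick up rows newer than those already loaded.
where order_date > (select max(order_date) from {{ this }})
{% endif %}
```

Running `dbt run --select stg_orders` would materialize this model, and `dbt test` would execute any schema tests declared for it, which is how the data-quality expectations in this role are typically enforced.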
Required Skillset
- Demonstrated ability to design and implement data transformation pipelines using Data Build Tool (DBT).
- Proven expertise in working with Azure cloud services, including Azure Data Factory, Azure Databricks, and Azure SQL Database.
- Strong proficiency in SQL, particularly with PostgreSQL, for data querying and manipulation.
- Experience with scripting languages such as Python and Shell Scripting for automating data pipeline tasks.
- Solid understanding of data warehousing concepts and data modeling techniques.
- Familiarity with version control systems like Git and Linux environments.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
Good To Have
- Exposure to Flyway or Artifactory
- Experience in handling large-scale data pipelines