
VAYUZ Technologies

ETL Developer

  • Posted 3 hours ago

Job Description

Company Overview

VAYUZ Technologies is a dynamic and rapidly growing technology company specializing in providing cutting-edge data solutions to the financial services industry. We empower our clients with actionable insights through robust data warehousing, business intelligence, and advanced analytics platforms. Our expertise lies in building scalable and reliable data pipelines that enable informed decision-making and drive business growth.

Role Overview

As an Extract, Transform, Load (ETL) Developer at VAYUZ Technologies, you will be responsible for designing, developing, and maintaining ETL processes that ingest, transform, and load data from various sources into our data warehouse. You will collaborate closely with data architects, data analysts, and business stakeholders to understand data requirements and translate them into efficient, scalable ETL solutions. Your work will directly impact the accuracy and availability of data used for critical business reporting and analytics, ultimately contributing to improved decision-making across the organization.

Key Responsibilities

  • Design and develop ETL workflows using Azure Data Factory and other relevant tools to extract data from diverse sources, including Oracle databases and flat files.
  • Implement data quality checks and validation rules within ETL processes to ensure data accuracy and consistency.
  • Optimize ETL performance by identifying and resolving bottlenecks in data processing pipelines.
  • Collaborate with data architects to define data models and schemas that support business requirements.
  • Develop and maintain shell scripts and Control-M jobs for scheduling and monitoring ETL processes.
  • Troubleshoot and resolve data-related issues, working closely with data analysts and business users.
  • Create and maintain comprehensive documentation for ETL processes and data mappings.
  • Implement and maintain CI/CD pipelines using Jenkins for automated deployment of ETL code.
  • Participate in code reviews and contribute to the development of best practices for ETL development.
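To give a flavor of the data quality checks and shell scripting mentioned above, here is a minimal sketch of a pre-load validation gate. The file name, delimiter, and expected column count are hypothetical illustrations, not details from this role; a real pipeline would wire such a check into a Control-M job or an Azure Data Factory validation activity.

```shell
#!/bin/sh
# Hypothetical pre-load quality gate: succeed only if every
# pipe-delimited row in FILE has exactly COLS fields.
# Usage: check_file FILE COLS
check_file() {
    file="$1"
    cols="$2"
    # awk prints rows whose field count differs; wc -l counts them.
    bad=$(awk -F'|' -v n="$cols" 'NF != n' "$file" | wc -l)
    [ "$bad" -eq 0 ]
}
```

A wrapper script would typically call this before the load step and exit non-zero on failure, so the scheduler can halt the downstream jobs and alert the on-call team.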

Required Skillset

  • Proven ability to design, develop, and maintain complex ETL processes using industry-standard tools and techniques.
  • Strong expertise in Azure Data Factory, including experience with data flows, pipelines, and triggers.
  • Solid understanding of Oracle database concepts and experience writing complex SQL queries and stored procedures.
  • Proficiency in shell scripting and experience working in a Linux environment.
  • Experience with job scheduling and workload automation tools such as Control-M.
  • Familiarity with CI/CD pipelines and tools such as Jenkins for automated deployment.
  • Excellent communication and collaboration skills, with the ability to effectively communicate technical concepts to both technical and non-technical audiences.
  • A Bachelor's degree in Computer Science, Information Technology, or a related field.

(ref:hirist.tech)

Job ID: 146454439
