- Design, develop, and maintain data integration workflows using Ab Initio (Graphical Development Environment and Co>Operating System).
- Collaborate with business analysts, data engineers, and other stakeholders to understand requirements and translate them into technical solutions.
- Design and implement efficient and scalable ETL processes for data extraction, transformation, and loading.
- Perform data analysis and data profiling to ensure high data quality and identify data issues or inconsistencies.
- Optimize Ab Initio jobs and workflows for performance improvements, ensuring efficient data processing.
- Debug and troubleshoot production issues related to Ab Initio applications, providing timely solutions.
- Write unit tests and conduct system testing to validate the integrity of data processing logic.
- Maintain comprehensive data integration documentation, including data flow diagrams and process documentation.
- Work closely with data architects and infrastructure teams to ensure alignment with the overall data architecture and technology stack.
- Ensure adherence to best practices, coding standards, and operational procedures within the team.
- Continuously enhance Ab Initio knowledge and stay updated with new features and releases.
- Provide mentorship and guidance to junior team members.
Required Skills and Qualifications:
- 4 to 12 years of experience in Ab Initio development, with proven expertise in building and maintaining ETL solutions.
- Strong experience with Ab Initio tools, including Graphical Development Environment, Co>Operating System, and other related components.
- Proficient in SQL for database querying and troubleshooting.
- Solid understanding of data warehousing concepts, ETL processes, and data modeling.
- Experience in data processing within large-scale environments and handling high-volume data pipelines.
- Strong analytical and problem-solving skills with a keen attention to detail.
- Ability to work collaboratively in a team environment and effectively communicate technical solutions to non-technical stakeholders.
- Strong understanding of software development best practices, version control, and deployment processes.
- Experience with scheduling and automation tools for job management.
- Excellent troubleshooting and debugging skills.
Desired Skills (Nice to Have):
- Knowledge of cloud-based data technologies (AWS, GCP, Azure).
- Experience with other ETL tools or technologies such as Informatica, Talend, or DataStage.
- Familiarity with Agile methodologies.
- Exposure to data governance, data security, and compliance standards.
Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree is a plus.
Additional Information:
- Strong communication skills and the ability to work in a fast-paced environment.
- Willingness to learn new technologies and keep up with evolving tools and practices in data engineering.
If you find this opportunity interesting, kindly share the following details (mandatory):
- Total experience:
- Experience in Ab Initio:
- Experience in Oracle:
- Experience in Unix:
- Current CTC:
- Expected CTC:
- Notice period:
- Current location: