
Global Software Solutions Group

Data Engineer (PySpark/Informatica BDM)

  • Posted 19 hours ago

Job Description

We are seeking a skilled Data Engineer to join the Group Risk team. You will build and manage robust data pipelines that support IFRS9 reporting, collaborating closely with business stakeholders to understand data requirements, perform impact analysis, and deliver high-quality data solutions using modern data engineering technologies such as PySpark and Informatica BDM.

Key Responsibilities

  • Collaborate with the Group Risk Team to gather and understand business and data requirements
  • Perform impact assessment and technical data mapping for new and existing data sources
  • Conduct data profiling to ensure data quality, consistency, and completeness
  • Design, develop, and maintain ETL pipelines using PySpark and Informatica BDM
  • Build scalable data transformation workflows aligned with IFRS9 data models
  • Ensure accurate data extraction, transformation, and loading (ETL) into reporting systems
  • Participate in unit testing, validation, and deployment of data pipelines
  • Optimize data processing performance and troubleshoot production issues
  • Adopt modern tools (e.g., AI-assisted tools like Claude) to improve productivity, reduce errors, and enhance development workflows
  • Maintain proper documentation for data flows, mappings, and processes

Required Skills & Qualifications

  • Strong experience in PySpark for large-scale data processing
  • Hands-on experience with Informatica BDM (Big Data Management)
  • Solid understanding of ETL concepts, data warehousing, and data modeling
  • Experience with data profiling, data mapping, and impact analysis
  • Knowledge of IFRS9 or Risk/Banking domain is highly preferred
  • Familiarity with distributed data processing frameworks and big data ecosystems
  • Strong SQL skills and experience working with relational databases
  • Good understanding of data quality and governance principles

Preferred Skills

  • Exposure to cloud platforms (AWS / Azure / GCP)
  • Experience with AI-assisted development tools (e.g., Claude, GitHub Copilot)
  • Knowledge of CI/CD pipelines in data engineering workflows

Soft Skills

  • Strong analytical and problem-solving skills
  • Excellent communication and stakeholder management abilities
  • Ability to work in a fast-paced, collaborative environment

More Info


Job ID: 145418313