- Role: Senior Data Integration & API Engineer
- Experience: 6–10 Years
- Location: Remote
- Domain: Banking and Finance
- Skills Required: SQL, Data Integration, REST APIs, Data Transformation, ODS/Data Warehousing, Batch Processing, Python, Snowflake, Git
- Immediate joiners only
- Strong communication skills
About the Role
We are looking for a Senior Data Integration & API Engineer with 6–10 years of experience to design and build scalable data pipelines and integration solutions. This role focuses on customer and account data management, business rule implementation, and API- and file-based system integrations.
Key Responsibilities
- Design and build end-to-end data integration workflows across multiple systems
- Develop and optimize scalable data pipelines using SQL and Python
- Work with modern data platforms such as Snowflake or Databricks
- Implement data transformation and aggregation logic for reusable datasets
- Build and integrate REST APIs and SFTP-based file solutions (JSON/XML)
- Apply business rules for account eligibility, validation, and processing
- Handle edge cases (inactive accounts, duplicates, zero-balance scenarios)
- Ensure proper workflow sequencing and dependency management
- Implement testing, logging, error handling, and reprocessing mechanisms
- Collaborate with cross-functional teams and maintain documentation
Requirements
- 6–10 years of experience in Data Engineering / Data Integration / Backend Engineering
- Strong expertise in SQL (advanced) and Python
- Hands-on experience with Snowflake or Databricks
- Experience with REST APIs, SFTP, JSON/XML integrations
- Solid understanding of operational data stores (ODS), data warehousing, and batch processing
- Experience handling large-scale data pipelines and multi-system integrations
- Strong problem-solving and analytical skills
Good to Have
- Experience in banking / financial services domain
- Exposure to ETL tools such as Informatica, Talend, or SSIS
- Familiarity with customer/account data models
- Experience with core banking platforms (e.g., Fiserv)