
Data Engineer (Python / PySpark / SQL / AI)
Are you a forward-thinking Data Engineer with a passion for exploring uncharted technological territories? Join us in transforming how businesses leverage data through innovative AI applications and cutting-edge solutions. We're seeking a curious mind who can connect dots across complex data landscapes and drive pioneering initiatives that others haven't yet imagined.
As a Data Engineer on our team, you'll be at the intersection of data infrastructure and artificial intelligence, creating pathways for AI-driven insights from diverse data sources, including SAP systems. This role offers the unique opportunity to blaze new trails in how we extract value from enterprise data.
Design and develop robust data pipelines using Python, PySpark, and SQL to support AI and analytics initiatives
Architect and implement scalable data solutions that connect disparate systems, with special focus on SAP data integration
Explore and pioneer new AI use cases leveraging enterprise data that haven't been previously attempted
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
Optimize existing data workflows for improved performance, reliability, and scalability
Research and implement innovative approaches to data extraction, transformation, and loading processes
Ensure data quality, governance, and security standards are maintained across all solutions
Document technical processes and knowledge to enable team growth and solution sustainability
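The pipeline responsibilities above follow the classic extract-transform-load pattern. As a rough illustration only, here is a minimal, stdlib-only Python sketch of that pattern; in practice this role would use PySpark DataFrames and SQL against enterprise sources, and the table and column names here (raw_orders, customer_revenue) are hypothetical:

```python
import sqlite3

def run_pipeline(conn):
    """Toy ETL job: extract raw orders, clean and aggregate, load a curated table."""
    # Extract: read rows from a (hypothetical) source table.
    rows = conn.execute("SELECT customer, amount FROM raw_orders").fetchall()

    # Transform: drop invalid amounts (basic data-quality filter)
    # and aggregate revenue per customer.
    totals = {}
    for customer, amount in rows:
        if amount is None or amount < 0:
            continue
        totals[customer] = totals.get(customer, 0.0) + amount

    # Load: write the aggregates into a curated target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customer_revenue "
        "(customer TEXT PRIMARY KEY, revenue REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO customer_revenue VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?)",
        [("acme", 100.0), ("acme", 50.0), ("globex", -5.0), ("globex", 20.0)],
    )
    run_pipeline(conn)
    print(dict(conn.execute("SELECT customer, revenue FROM customer_revenue")))
```

In a PySpark setting the same shape appears as a read from source, a chain of DataFrame transformations, and a write to a target table, but the sketch keeps the structure visible without any cluster dependencies.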
You'll join a diverse and dynamic team of SAP application engineers and technical business analysts working in a collocated agile setup. Our globally distributed team operates from Zurich, Bratislava and India, collaborating with stakeholders worldwide to deliver exceptional results.
You're a naturally curious problem-solver who thrives when exploring uncharted territory. You have a talent for quickly grasping complex concepts and connecting seemingly unrelated ideas to create innovative solutions. Your technical expertise is matched by your ability to communicate complex technical concepts clearly and your drive to continuously learn and adapt in a rapidly evolving field.
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related technical field
Strong proficiency in Python programming with experience building data pipelines and ETL processes
Hands-on experience with PySpark for large-scale data processing
Advanced SQL skills for complex data manipulation and analysis
Demonstrated experience with AI/ML technologies and their practical applications in business contexts
Proven ability to work independently and drive initiatives from concept to implementation
Experience working with SAP systems (S/4HANA), including extracting and transforming SAP data
Knowledge of interface design and system integration patterns
Familiarity with data visualization tools and techniques
Understanding of data governance principles and practices
Previous work in implementing AI solutions in enterprise environments
Experience with real-time data processing frameworks
Our company has a hybrid work model, with the expectation that you will be in the office at least three days per week.
Swiss Reinsurance Company Ltd, commonly known as Swiss Re, is a reinsurance company based in Zurich, Switzerland. It is the world's largest reinsurer, as measured by net premiums written.
Job ID: 144915949