Who We Are
Welcome to TELUS Digital — where innovation drives impact at a global scale. As an award-winning digital product consultancy and the digital division of TELUS, one of Canada's largest telecommunications providers, we design and deliver transformative customer experiences through cutting-edge technology, agile thinking, and a people-first culture.
With a global team across North America, South America, Central America, Europe, Africa, and APAC, we offer end-to-end expertise across various service offerings: Web, Mobile & Digital Marketing | Enterprise AI | Customer Care AI & Technology | Enterprise Technology Modernization
From mobile apps and websites to voice UI, chatbots, AI, customer service, and in-store solutions, TELUS Digital enables seamless, trusted, and digitally powered experiences that meet customers wherever they are — all backed by the secure infrastructure and scale of our multi-billion-dollar parent company.
Location & Flexibility
India Hybrid: Noida/Bangalore/Hyderabad
This is a hybrid role, requiring you to work from our office in Noida, Bengaluru, or Hyderabad 1 to 2 times per week. Our office culture is designed to foster in-person innovation, collaboration, and connection with team members, local and visiting, from other global offices.
The Opportunity
As an Applications Development Module Lead, you will apply your technical knowledge to implement process automation and will be responsible for web testing across the phases of the client implementation life cycle on an ongoing basis.
Responsibilities
The TELUS Health Data Platform (THDP), part of the TELUS AI Office, is responsible for transforming healthcare data integration through cutting-edge streaming technologies and FHIR standardization.

TELUS is revolutionizing healthcare by connecting patients and providers through our award-winning Collaborative Health Record (CHR), an end-to-end, secure, cloud-based digital platform. Our FHIR4 Extract Pipeline serves as the backbone for batch and real-time healthcare data transformation, converting diverse healthcare data sources into standardized FHIR R4 resources. This enables seamless interoperability across our growing portfolio of 2,000+ healthcare providers across Canada, the United States, New Zealand, and Australia.

Transform healthcare data in real time with cutting-edge streaming technology. Join our Healthcare Data Engineering team.

Our Healthcare Data Engineering team is at the forefront of healthcare data interoperability, working with Apache Spark (PySpark), Apache Flink, FHIR R4 standards, and modern streaming architectures. We're building the next generation of healthcare data pipelines that process millions of clinical records in real time, transforming diverse EMR data into standardized FHIR resources that enable coordinated patient care and clinical insights.

As the newest member of our team, you will work on the FHIR4 Extract Pipeline, a sophisticated batch/streaming data processing framework that transforms healthcare data from sources like MedAccess EMR, laboratory systems, and clinical devices into FHIR R4 compliant resources. You'll collaborate with healthcare IT professionals, data scientists, and clinical experts to ensure our pipeline meets the demanding requirements of real-time healthcare data processing.

Here's the impact you'll make and what we'll accomplish together
As a Healthcare Data Engineer on the FHIR4 Extract Pipeline team, you will design, implement, and optimize batch and streaming data transformations that convert complex healthcare data into standardized FHIR R4 resources. Your work will directly enable real-time clinical decision support, population health management, and seamless data exchange across healthcare systems. You'll be working with cutting-edge technologies including PySpark, Apache Flink, Ibis dataframes, Kafka streaming, and Kubernetes orchestration to build scalable, reliable healthcare data pipelines.
Here's how
- Developing and maintaining PySpark- and Ibis-based transform functions to convert healthcare data from TELUS's suite of EMRs into standardized FHIR R4 resources (Patient, Encounter, Observation, MedicationRequest, etc.).
- Optimizing Apache Flink streaming jobs for high-volume healthcare data processing with low latency requirements.
- Collaborating with healthcare domain experts to ensure FHIR compliance and clinical data accuracy.
- Working with Kafka streaming infrastructure to handle real-time data ingestion and distribution.
- Implementing comprehensive testing strategies including unit tests, FHIR validation, and data consistency checks.
- Contributing to schema management and evolution strategies for healthcare data transformation.
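To give a flavour of the transform work described above, here is a minimal, illustrative sketch of mapping a flat EMR patient record into a FHIR R4 Patient resource. The input column names (`pat_id`, `dob`, etc.) are hypothetical, not the real CHR or MedAccess schema; in the actual pipeline, logic like this would run per record inside a PySpark or Flink job rather than over plain dictionaries.

```python
from typing import Any, Dict

def emr_patient_to_fhir(row: Dict[str, Any]) -> Dict[str, Any]:
    """Map one flat EMR patient row to a FHIR R4 Patient resource.

    Column names here are hypothetical examples. Gender is normalized
    to the FHIR administrative-gender value set (male | female |
    other | unknown), and birthDate is assumed to already be an
    ISO 8601 date string (YYYY-MM-DD).
    """
    return {
        "resourceType": "Patient",
        "id": str(row["pat_id"]),
        "name": [{
            "family": row.get("last_name"),
            "given": [row.get("first_name")],
        }],
        "gender": {"F": "female", "M": "male"}.get(row.get("sex"), "unknown"),
        "birthDate": row.get("dob"),
    }

# Example usage with a made-up record:
patient = emr_patient_to_fhir({
    "pat_id": 42,
    "last_name": "Singh",
    "first_name": "Asha",
    "sex": "F",
    "dob": "1990-04-17",
})
```

In production, a pure function of this shape is easy to unit-test and to validate against a FHIR profile before publishing downstream, which is how it connects to the testing and validation responsibilities listed above.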
You're the missing piece of the puzzle
- Strong experience with PySpark, Apache Flink, or similar distributed stream processing frameworks (e.g., Kafka Streams, Apache Storm).
- Knowledge of FHIR R4 standards and healthcare data interoperability concepts.
- Experience with Python data processing libraries (Pandas, PyArrow) and dataframe abstractions like Ibis.
- Proficiency in SQL and understanding of complex data transformations and joins.
- Experience with Apache Kafka for real-time data streaming and event-driven architectures.
- Hands-on experience with containerization (Docker) and Kubernetes orchestration.
- Understanding of healthcare data sources (EMRs, laboratory systems, clinical devices) and clinical workflows.
- Experience with Google Cloud Platform services, particularly BigQuery, Cloud Storage, and GKE.
- Strong analytical and problem-solving skills with attention to data quality and clinical accuracy.
- Experience with modern development practices including Git, CI/CD, and automated testing.
- Ability to work independently and collaboratively in a fast-paced healthcare technology environment.
Great-to-haves
- Experience with PyFlink (Python API for Apache Flink) and Flink SQL
- Knowledge of FHIR/HL7 standards, clinical terminologies (SNOMED, LOINC, ICD-10), and healthcare data governance.
- Experience with Pixi environment management and modern Python dependency management.
- Familiarity with Apache Iceberg or other modern data lake storage formats.
- Experience with Flink Kubernetes Operator and cloud-native deployment patterns.
- Knowledge of healthcare compliance requirements (HIPAA, PIPEDA) and data privacy considerations.
- Experience with monitoring and observability tools for distributed systems (Prometheus, Grafana).
- Background in clinical informatics, health information management, or biomedical engineering.
- Experience with EMR systems like MedAccess, Epic, or Cerner.
- Familiarity with Palantir Foundry, Databricks or similar data integration platforms
- Understanding of real-time vs. batch processing trade-offs in healthcare data systems
- Evaluation skillset: PySpark, Python, SQL, Pub/Sub, and cloud platforms (AWS or GCP).
Qualifications
- Bachelor's degree in Computer Science or related technical field, or equivalent practical experience.
Bonus Points
- Working knowledge of, or certification in, Google Cloud Platform (GCP).
Equal Opportunity Employer
At TELUS Digital, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants' qualifications, merit, competence, and performance without regard to any characteristic related to diversity.
We will only use the information you provide to process your application and to produce tracking statistics. Since we do not request personal data deemed sensitive, we ask you to abstain from sharing that information with us.
For more information on how we use your information, see our Privacy Policy.