Role Overview
We are seeking a GCP Data Engineer who lives at the intersection of Cloud Engineering and Healthcare Finance. Your mission is to build the ingestion engine for our RCM platform, handling the high-volume exchange of X12 EDI transactions. You will own the pipeline that turns cryptic EDI files into actionable FHIR resources and BigQuery insights, ensuring every claim is tracked, validated, and compliant.
Key Responsibilities
1. X12 & EDI Pipeline Engineering
- Ingestion: Build serverless SFTP/MFT ingestion points using Cloud Storage triggers and Cloud Functions (a minimal sketch follows this list).
- Parsing & Mapping: Develop and maintain complex mapping logic to convert X12 (837P/I, 835, 270/271) files into JSON/FHIR using Cloud Dataflow (Apache Beam) or custom Python parsers.
- Validation: Implement X12 syntax validation and SNIP-level checks to ensure claims are clean before they hit the payer gateway.
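For context on the ingestion pattern described above, here is a minimal sketch of a Cloud Storage-triggered Cloud Function that picks up an inbound X12 file and splits it into segments using the delimiters declared in the ISA header. The bucket layout and the `ingest_x12` entry point are illustrative assumptions, not a prescribed design.

```python
# A minimal sketch (not the platform's actual design) of a GCS-triggered
# Cloud Function that picks up an inbound X12 file and splits it into
# segments using the delimiters declared in the ISA header.
import functions_framework
from google.cloud import storage

storage_client = storage.Client()


def split_segments(raw: str) -> list[list[str]]:
    """Split an X12 interchange into segments/elements using ISA-declared delimiters."""
    # In the fixed-width ISA segment, the element separator is the 4th
    # character and the segment terminator is the 106th character.
    element_sep = raw[3]
    segment_term = raw[105]
    return [seg.strip().split(element_sep)
            for seg in raw.split(segment_term) if seg.strip()]


@functions_framework.cloud_event
def ingest_x12(event):
    """Entry point for a Cloud Storage 'object finalized' event (Eventarc)."""
    data = event.data
    raw = storage_client.bucket(data["bucket"]).blob(data["name"]).download_as_text()

    segments = split_segments(raw)
    # ST01 identifies the transaction set: 837 (claim), 835 (remit), 270/271 (eligibility).
    st = next((seg for seg in segments if seg[0] == "ST"), None)
    tx_set = st[1] if st else "unknown"
    print(f"{data['name']}: transaction set {tx_set}, {len(segments)} segments parsed")
```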
2. Healthcare Data Interoperability
- FHIR Integration: Utilize the Google Cloud Healthcare API to store and manage clinical/financial data.
- Data Harmonization: Map X12 Loops and Segments (e.g., NM1, CLM, REF) to FHIR Resources (Claim, ExplanationOfBenefit, Patient), as illustrated in the sketch after this list.
- Code Normalization: Use BigQuery to map legacy internal codes to standard healthcare terminologies (ICD-10, CPT, HCPCS, NPI).
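The harmonization bullet above is the kind of mapping sketched below: a heavily simplified translation of a parsed 837 CLM segment and subscriber NM1 into a FHIR R4 Claim. Element positions follow the 837P structure, but the provider and coverage references are placeholders standing in for full loop handling.

```python
# A heavily simplified sketch of mapping a parsed 837 claim into a FHIR R4
# Claim resource. Element positions (CLM01, CLM02, NM103/NM104) follow the
# 837P structure; provider and coverage are placeholder displays that a real
# mapper would resolve from the 2010AA/2010BB loops.
from datetime import date


def claim_from_837(clm: list[str], subscriber_nm1: list[str]) -> dict:
    """Build a minimal FHIR Claim from a CLM segment and the subscriber's NM1."""
    last_name = subscriber_nm1[3] if len(subscriber_nm1) > 3 else ""
    first_name = subscriber_nm1[4] if len(subscriber_nm1) > 4 else ""
    return {
        "resourceType": "Claim",
        "status": "active",
        "use": "claim",
        "created": date.today().isoformat(),
        "type": {"coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/claim-type",
            "code": "professional"}]},
        "identifier": [{"value": clm[1]}],                     # CLM01: patient control number
        "total": {"value": float(clm[2]), "currency": "USD"},  # CLM02: total charge amount
        "patient": {"display": f"{first_name} {last_name}".strip()},
        "priority": {"coding": [{"code": "normal"}]},
        "provider": {"display": "billing provider (loop 2010AA)"},
        "insurance": [{"sequence": 1, "focal": True,
                       "coverage": {"display": "payer (loop 2010BB)"}}],
    }


# Example input: CLM*26463774*100*...~ and NM1*IL*1*DOE*JOHN*...~
print(claim_from_837(["CLM", "26463774", "100"], ["NM1", "IL", "1", "DOE", "JOHN"]))
```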
3. High-Tech Data Stack (Serverless)
- Lakehouse: Architect a BigQuery-based data lakehouse to store raw EDI, transformed FHIR, and aggregated financial metrics.
- Orchestration: Use Cloud Composer (Airflow) to manage the multi-step dependencies of the daily Remit (835) processing cycle (see the DAG sketch after this list).
- DLP & Masking: Ensure HIPAA compliance by automating the discovery and masking of PII/PHI in non-production environments using Cloud Sensitive Data Protection (DLP).
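As referenced in the orchestration bullet, the daily 835 remit cycle lends itself to a Cloud Composer DAG along these lines. The DAG id, schedule, and task callables are placeholders; in practice each step would launch a Dataflow job, call the Healthcare API, or run a BigQuery load rather than print a message.

```python
# A minimal sketch of a Cloud Composer (Airflow 2.x) DAG for the daily 835
# remit cycle. The DAG id, schedule, and task callables are placeholders;
# each real step would launch a Dataflow job, call the Healthcare API, or
# run a BigQuery load/merge rather than print a message.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def pull_remits(**_):
    print("list new 835 files landed in the intake bucket")


def parse_remits(**_):
    print("run the X12-to-JSON parse job")


def load_to_bigquery(**_):
    print("load normalized remit rows into BigQuery")


def reconcile(**_):
    print("match 835 payments back to submitted 837 claims")


with DAG(
    dag_id="daily_835_remit_cycle",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # once daily, after payer remit files typically arrive
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="pull_remits", python_callable=pull_remits)
    t2 = PythonOperator(task_id="parse_remits", python_callable=parse_remits)
    t3 = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
    t4 = PythonOperator(task_id="reconcile_payments", python_callable=reconcile)

    t1 >> t2 >> t3 >> t4
```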
Technical Requirements
- X12 Mastery: Deep understanding of EDI standards, loops, segments, and elements.
- GCP Data Stack: 5+ years of hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Healthcare API.
- Coding: Expert-level Python (for custom parsing) and SQL (for complex financial reconciliation).
- Compliance: Proven experience building HIPAA-compliant pipelines; understanding of BAA and technical safeguards.
- DevOps for Data: Experience with dbt (data build tool) and Terraform for managing data infrastructure.
- Governance Tooling: Hands-on experience with Dataplex, Cloud Sensitive Data Protection (DLP) for PII discovery, and Object Lifecycle Management (OLM) for cost-optimized, compliant data aging.
- Rule Design: Ability to translate medical billing business rules (e.g., CPT-ICD10 code pairing) into automated SQL-based quality checks (see the example below).
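As an illustration of the rule-design requirement above, a CPT/ICD-10 pairing rule can be expressed as a SQL check run from Python against BigQuery. The dataset and table names (`rcm.claim_lines`, `rcm.cpt_icd10_rules`) and column names are hypothetical placeholders, not the platform's real schema.

```python
# A sketch of an automated CPT/ICD-10 pairing check run against BigQuery.
# The dataset and table names (`rcm.claim_lines`, `rcm.cpt_icd10_rules`) and
# column names are hypothetical placeholders, not the platform's real schema.
from google.cloud import bigquery

FLAG_INVALID_PAIRS = """
SELECT l.claim_id, l.cpt_code, l.icd10_code
FROM `rcm.claim_lines` AS l
LEFT JOIN `rcm.cpt_icd10_rules` AS r
  ON l.cpt_code = r.cpt_code AND l.icd10_code = r.icd10_code
WHERE r.cpt_code IS NULL                                   -- no approved pairing exists
  AND l.service_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
"""


def run_pairing_check() -> int:
    """Return the number of claim lines whose CPT/ICD-10 pairing has no approved rule."""
    client = bigquery.Client()
    rows = list(client.query(FLAG_INVALID_PAIRS).result())
    for row in rows:
        print(f"claim {row.claim_id}: CPT {row.cpt_code} not payable with ICD-10 {row.icd10_code}")
    return len(rows)


if __name__ == "__main__":
    print(f"{run_pairing_check()} claim line(s) flagged for review")
```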
Shift: 12:00 PM to 9:00 PM IST
Why Join Us:
- We trust our people and offer completely remote opportunities.
- Flexible work schedules for better work-life balance.
- A team of 580+ agile, smart, and dynamic IT professionals.
- Supportive and collaborative work environment.
- 5-day work week (Monday to Friday); all weekends are off!
- Great working and learning environment.
- Company-sponsored insurance!