The Role
We are hiring a Senior Data Engineer to join our Data Architecture & Engineering practice. You will design, build, and maintain the data pipelines that power healthcare payer analytics, including claims processing, contract configuration, pricing edits, and fee schedule ingestion. This is hands-on engineering work in a Snowflake and AWS environment, not a management role.
Work You'll Do
- Design, develop, and maintain scalable data pipelines in Snowflake for healthcare claims, contract configuration, and fee-for-service reimbursement logic
- Build and optimize ELT processes using Snowflake SQL, stored procedures, and dynamic SQL for complex healthcare datasets (see the dynamic SQL sketch after this list)
- Develop and maintain Apache Airflow DAGs (AWS MWAA) for pipeline orchestration, scheduling, and monitoring across multiple data domains (see the DAG sketch after this list)
- Write clean, testable Python scripts for data transformation, validation, web scraping automation, and API integrations
- Implement and maintain CI/CD workflows using Bitbucket Pipelines and Liquibase for database migration and schema change management
- Support data quality and governance processes, including Snowflake role-based access control, dynamic data masking, and schema design patterns
- Monitor pipeline health, troubleshoot failures, and resolve incidents across Snowflake, AWS Lambda, S3, SNS, and CloudWatch
- Leverage AI coding assistants to accelerate development, with a focus on review, refinement, and quality assurance of AI-generated output
- Collaborate with business analysts and project teams to translate healthcare business requirements into technical pipeline specifications
- Document technical processes, pipeline architecture, and operational runbooks in Confluence
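To give a concrete flavor of the dynamic SQL work above, here is a minimal, hedged sketch in Python using snowflake-connector-python. The ALLOWED_DOMAINS mapping, table names, and row_count helper are illustrative assumptions, not part of our actual codebase.

```python
# Hypothetical sketch: generating dynamic SQL safely against Snowflake.
# ALLOWED_DOMAINS and row_count are illustrative, not a real schema.
import snowflake.connector

# Identifiers (table names) cannot be bound as query parameters, so they are
# validated against an allowlist before being interpolated into the SQL text.
ALLOWED_DOMAINS = {"claims": "RAW.CLAIMS", "fees": "RAW.FEE_SCHEDULES"}


def row_count(conn, domain: str, load_date: str) -> int:
    table = ALLOWED_DOMAINS[domain]  # raises KeyError for unknown domains
    sql = f"SELECT COUNT(*) FROM {table} WHERE load_date = %s"
    cur = conn.cursor()
    try:
        cur.execute(sql, (load_date,))  # the value is bound, never interpolated
        return cur.fetchone()[0]
    finally:
        cur.close()


if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="example_account",  # placeholder credentials
        user="example_user",
        password="example_password",
    )
    print(row_count(conn, "claims", "2024-01-15"))
```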
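And a minimal sketch of the orchestration side, assuming Airflow 2.4 or later (as on recent MWAA versions); the DAG id, schedule, and load_fee_schedule callable are placeholders for illustration only.

```python
# Hypothetical sketch of an MWAA-style Airflow DAG; names are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_fee_schedule(**context):
    # In practice this step would stage a fee schedule file from S3 into
    # Snowflake and run validation queries; here it only logs the run date.
    print("Loading fee schedule for", context["ds"])


with DAG(
    dag_id="fee_schedule_ingestion",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # the `schedule` argument requires Airflow 2.4+
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(
        task_id="ingest_fee_schedule",
        python_callable=load_fee_schedule,
    )
```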
Requirements
- 5+ years of hands-on data engineering experience
- Proficiency in Snowflake: ELT pipeline development, stored procedures, query optimization, data modeling (star, snowflake, and hybrid schemas)
- Strong SQL skills, including complex joins, window functions, CTEs, and dynamic SQL generation
- Solid Python experience in a data engineering context (Pandas, file processing, API clients, web scraping with BeautifulSoup or Scrapy); see the Pandas sketch after this list
- Experience with Apache Airflow for pipeline orchestration (AWS Managed Workflows for Apache Airflow preferred)
- Hands-on experience with AWS services: S3, Lambda, SNS, CloudWatch, IAM
- Proficiency with Git-based version control and CI/CD pipelines (Bitbucket Pipelines preferred)
- Experience with database migration tooling (Liquibase preferred)
- Demonstrated ability to work independently and manage priorities across concurrent workstreams
- Strong written and verbal communication skills in English, with the ability to collaborate effectively across time zones
- Experience working in globally distributed teams
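As a rough illustration of the Python and Pandas expectations above, here is a hedged sketch of a claims validation step; the column names and rules are assumptions for illustration, not our actual schema.

```python
# Hypothetical sketch of a Pandas validation step; columns are illustrative.
import pandas as pd

REQUIRED_COLUMNS = {"claim_id", "member_id", "service_date", "billed_amount"}


def validate_claims(df: pd.DataFrame) -> pd.DataFrame:
    """Return only the rows that pass basic structural and value checks."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {sorted(missing)}")

    df = df.copy()
    # Coerce bad dates to NaT so they fail the row-level checks below.
    df["service_date"] = pd.to_datetime(df["service_date"], errors="coerce")

    valid = (
        df["claim_id"].notna()
        & df["service_date"].notna()
        & (df["billed_amount"] >= 0)
    )
    return df[valid]


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "claim_id": ["C1", "C2", None],
            "member_id": ["M1", "M2", "M3"],
            "service_date": ["2024-01-15", "not-a-date", "2024-02-01"],
            "billed_amount": [120.50, 80.00, 45.00],
        }
    )
    print(validate_claims(sample))  # keeps only the first row
```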
Preferred Qualifications
- Healthcare payer domain experience: claims data, provider contracts, fee schedules, or pricing analytics
- Experience with data governance frameworks, HIPAA compliance, and PHI handling in cloud environments
- Familiarity with Power BI for data visualization and reporting
- Experience with AI-assisted development tools (Claude Code, GitHub Copilot, or similar)
- Familiarity with Terraform or infrastructure-as-code practices
- Experience with unit testing and test-driven development practices for data pipelines (see the pytest sketch after this list)
- Agile delivery experience (Scrum or Kanban), including sprint planning and backlog management in Jira
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience)
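For the testing bullet above, a minimal sketch of what pipeline unit tests can look like with pytest; apply_fee_schedule and the procedure codes are hypothetical examples, not a real transform.

```python
# Hypothetical sketch of pytest-style unit tests for a pipeline transform.
import pandas as pd
import pytest


def apply_fee_schedule(claims: pd.DataFrame, rates: dict) -> pd.DataFrame:
    """Hypothetical transform: price each claim line from a rate lookup."""
    priced = claims.copy()
    priced["allowed_amount"] = priced["procedure_code"].map(rates)
    if priced["allowed_amount"].isna().any():
        raise KeyError("procedure code missing from fee schedule")
    return priced


def test_apply_fee_schedule_prices_known_codes():
    claims = pd.DataFrame({"procedure_code": ["99213", "99214"]})
    rates = {"99213": 75.0, "99214": 110.0}
    result = apply_fee_schedule(claims, rates)
    assert result["allowed_amount"].tolist() == [75.0, 110.0]


def test_apply_fee_schedule_rejects_unknown_codes():
    claims = pd.DataFrame({"procedure_code": ["00000"]})
    with pytest.raises(KeyError):
        apply_fee_schedule(claims, {"99213": 75.0})
```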