
caliberfocus, inc

Data Engineer

  • Posted 23 hours ago

Job Description

Experience: 4 – 8 Years

Department: Data & Analytics

Domain: Healthcare Product

Job Summary

We are seeking a highly skilled Data Engineer with a strong foundation in data engineering concepts and substantial implementation experience with healthcare products and healthcare data ecosystems. The ideal candidate will design, develop, and optimize scalable data solutions that enable data-driven decision-making across the organization. The role requires hands-on experience with modern data architectures, cloud platforms, ETL/ELT processes, and healthcare data workflows to deliver scalable, compliant healthcare solutions.

Key Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines and data workflows.
  • Develop and manage data lakes, lakehouses, and data warehouses.
  • Implement data solutions using AWS, Azure, Microsoft Fabric, and Snowflake.
  • Integrate data from multiple sources including APIs, databases, and third-party systems.
  • Optimize data performance, reliability, and cost in cloud environments.
  • Ensure adherence to data quality, security, and governance standards.
  • Collaborate with data analysts, scientists, and business stakeholders.
  • Automate and orchestrate workflows using modern data engineering tools.
  • Monitor, troubleshoot, and document data pipelines and architectures.
  • Follow Agile, DevOps, and DataOps best practices.
  • Work closely with healthcare product teams to support healthcare data integration, analytics, and reporting requirements.
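To give candidates a concrete sense of the pipeline work described above, here is a minimal, purely illustrative extract-transform-load sketch in plain Python. All table names, fields, and data are hypothetical and do not reflect any actual system at the company:

```python
# Hypothetical ETL sketch: extract -> transform -> load.
# All sources, fields, and values are invented for illustration only.
from datetime import date


def extract():
    # In a real pipeline this would pull from an API, database, or file drop.
    return [
        {"patient_id": "P001", "visit_date": "2024-01-15", "charge": "120.50"},
        {"patient_id": "P002", "visit_date": "2024-01-16", "charge": "85.00"},
    ]


def transform(rows):
    # Type coercion and basic data-quality enforcement.
    clean = []
    for row in rows:
        clean.append({
            "patient_id": row["patient_id"],
            "visit_date": date.fromisoformat(row["visit_date"]),
            "charge": float(row["charge"]),
        })
    return clean


def load(rows, warehouse):
    # Stand-in for a write to a warehouse table (e.g. Snowflake, Redshift).
    warehouse.extend(rows)
    return len(rows)


warehouse = []
loaded = load(transform(extract()), warehouse)
```

In production this pattern would typically run under an orchestrator such as Airflow, with the load step targeting a cloud warehouse rather than an in-memory list.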

Required Technical Skills

  • Programming & Querying: Python, SQL, PySpark
  • Cloud Platforms: strong hands-on experience in at least one of the following:
  • AWS: S3, Glue, Redshift, Lambda, Athena
  • Azure: Azure Data Factory, Azure Databricks, Azure Synapse, ADLS
  • Microsoft Fabric: OneLake, Data Factory, Lakehouse, Warehouse, Power BI
  • Data Warehouse: Snowflake (Snowpipe, Streams, Tasks, Time Travel)
  • Tools & Technologies: Apache Spark, Databricks, Airflow, dbt
  • Databases: SQL Server, PostgreSQL, MySQL, Oracle
  • DevOps & Version Control: Git, Azure DevOps, CI/CD
  • Domain: Strong knowledge of healthcare products, healthcare workflows, and healthcare data standards.
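As a small illustration of the Python + SQL combination listed above, the following sketch runs an analytical query from Python. SQLite stands in here purely as a self-contained engine; the table and data are hypothetical:

```python
import sqlite3

# Hypothetical claims table queried with SQL from Python.
# SQLite is used only as a stand-in for SQL Server/PostgreSQL/etc.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("C1", "paid", 120.0), ("C2", "denied", 80.0), ("C3", "paid", 200.0)],
)

# Aggregate query of the kind a data engineer writes daily.
total_paid = conn.execute(
    "SELECT SUM(amount) FROM claims WHERE status = 'paid'"
).fetchone()[0]
```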

Preferred Qualifications

  • Bachelor's/Master's degree in Computer Science, IT, or a related field.
  • Experience with real-time data processing using Kafka or Kinesis.
  • Knowledge of Medallion Architecture, Delta Lake, and Lakehouse concepts.
  • Familiarity with Power BI and data visualization tools.
  • Experience with Infrastructure as Code (Terraform, ARM templates).
  • Exposure to healthcare systems, EHR/EMR platforms, or healthcare analytics projects is an added advantage.
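For context, the Medallion Architecture mentioned above layers data into bronze (raw), silver (validated), and gold (business-level) tables. The sketch below illustrates the concept in plain Python with invented data; real implementations use Delta Lake tables rather than lists and dicts:

```python
# Hypothetical bronze -> silver -> gold refinement illustrating the
# Medallion layering concept (raw ingest, cleaned, aggregated).
bronze = [
    {"dept": "cardiology", "amount": "100"},
    {"dept": "cardiology", "amount": "250"},
    {"dept": None, "amount": "50"},  # bad record, dropped in silver
]

# Silver layer: validated, typed records only.
silver = [
    {"dept": r["dept"], "amount": float(r["amount"])}
    for r in bronze
    if r["dept"] is not None
]

# Gold layer: business-level aggregate per department.
gold = {}
for r in silver:
    gold[r["dept"]] = gold.get(r["dept"], 0.0) + r["amount"]
```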

Certifications (Preferred)

  • Microsoft Certified: Azure Data Engineer Associate (DP-203)
  • Microsoft Fabric Analytics Engineer Associate (DP-600)
  • AWS Certified Data Engineer – Associate
  • Snowflake SnowPro Core Certification
  • Databricks Certified Data Engineer Associate

Job ID: 147319327
