
About the Role
The Data Platform team at Chargebee builds and maintains scalable data systems that power internal analytics, business intelligence, and customer-facing data features.
As a Data Engineer, you will build and maintain reliable data pipelines across multiple layers of the data platform, including data ingestion, distributed processing, transformation, and data serving.
You will collaborate closely with product engineers, analysts, and platform teams to ensure that data is ingested, processed, and made available efficiently for analytics and product use cases. The role offers the opportunity to work with modern data technologies and distributed systems, solving real-world data challenges at scale.
The team operates in a fast-paced, collaborative environment, building reliable and scalable infrastructure that supports Chargebee's growing data needs.
What You Will Work On
As a Data Engineer, you will contribute to the development and evolution of Chargebee's data platform. This includes working on systems responsible for ingesting large volumes of data, processing and transforming it using distributed systems like Apache Spark, and making it available for analytics and customer-facing products.
The role provides exposure to:
Key Responsibilities
Minimum Qualifications
Good-to-Have Qualifications
Job ID: 147514107
Skills:
Apache Airflow, Hadoop, Spark, Apache Beam, Python, Google DataProc, dbt, Google DataFlow, Google Cloud Ecosystem, Google Cloud Storage