
Quorum Software

Senior Data Engineer (Hybrid Work Schedule)

4-6 Years

Job Description

Senior Data Engineer

Location: Pune, India

Model of Work: Hybrid

About Quorum Software

Quorum Software connects people and information across the energy value chain. Twenty years ago, we built the first software for gas plant accountants. Pipeline operators came next, followed by land administrators, pumpers, and planners. Since 1998, Quorum has helped thousands of energy workers with business workflows that optimize profitability and growth. Our vision for the future connects the global energy ecosystem through cloud-first software, data standards, and integration. The trusted source of decision-ready data for 1,800+ companies, Quorum Software makes the essential connections that let us work better together in the connected energy workplace. For more information, visit quorumsoftware.com.

Be a part of our legacy

Quorum Software is the world's largest provider of digital technology focused solely on business workflows that empower the next evolution of energy. From emerging companies to supermajors, throughout every region of the globe, customers rely on Quorum's proven innovation and unmatched global expertise to streamline business operations and make data-driven decisions that optimize profitability and growth. Our industry-leading solutions are transforming energy companies across the entire value chain, helping visionary leaders evolve their organizations into modern energy companies.

Who We Are Looking For

Are you excited by challenges? Do you enjoy working in a fast-paced, international, and dynamic environment? Then now is the time to join Quorum Software, a rapidly growing company and industry leader in oil & gas transformation.

Job Purpose

We are looking for a Senior Data Engineer to join our data team and play a key hands-on role in building and scaling our enterprise data platform. You will work closely with the Data Engineering Manager and cross-functional teams to design and deliver high-quality data pipelines, lakehouse infrastructure, and APIs that power analytics, reporting, and machine learning across multiple business segments.

You are a strong individual contributor who takes ownership of complex technical problems, brings opinions to architecture discussions, and raises the quality bar of everything you touch. You're not just executing tickets — you're helping shape how data moves, transforms, and gets consumed across the organization.

What You Will Do

  • Design, build, and maintain scalable data pipelines that move and transform data across Bronze, Silver, and Gold layers within a Medallion Architecture
  • Develop and optimize data workflows within Databricks, including Delta Lake tables, Databricks Workflows, and Unity Catalog
  • Build RESTful APIs that expose curated data assets to downstream consumers including analysts, applications, and external partners
  • Write high-quality, well-tested SQL and PySpark transformations that are reliable, efficient, and easy to maintain
  • Contribute to pipeline monitoring, data quality frameworks, and alerting to ensure SLAs are met consistently
  • Collaborate with data SMEs to support feature pipelines, experiment tracking, and the operationalization of ML models
  • Participate in architecture and design reviews, bringing practical experience and a critical eye to technical decisions
  • Mentor junior engineers through code reviews, pairing sessions, and knowledge sharing
  • Work with Azure data services to manage storage, orchestration, security, and infrastructure
  • Collaborate with our Product leaders on design and development work
  • And other duties as assigned.
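The Bronze/Silver/Gold progression named above can be sketched in plain Python. This is a minimal illustration of the Medallion idea, not Quorum's actual schema; the table fields and cleaning rules are hypothetical.

```python
# Bronze: raw records exactly as ingested, including duplicates and bad rows.
bronze = [
    {"well_id": "W1", "oil_bbl": "120", "date": "2024-01-01"},
    {"well_id": "W1", "oil_bbl": "120", "date": "2024-01-01"},  # duplicate
    {"well_id": "W2", "oil_bbl": "bad", "date": "2024-01-01"},  # fails typing
    {"well_id": "W2", "oil_bbl": "95",  "date": "2024-01-02"},
]

def to_silver(rows):
    """Silver: deduplicated, typed, validated records."""
    seen, out = set(), []
    for r in rows:
        try:
            vol = float(r["oil_bbl"])
        except ValueError:
            continue  # in practice, quarantine bad rows rather than drop silently
        key = (r["well_id"], r["date"])
        if key not in seen:
            seen.add(key)
            out.append({"well_id": r["well_id"], "date": r["date"], "oil_bbl": vol})
    return out

def to_gold(rows):
    """Gold: business-level aggregate (total volume per well)."""
    totals = {}
    for r in rows:
        totals[r["well_id"]] = totals.get(r["well_id"], 0.0) + r["oil_bbl"]
    return totals
```

In a Databricks lakehouse the same three stages would typically be Delta Lake tables wired together by Databricks Workflows, with the cleaning and aggregation written in PySpark or Spark SQL rather than plain Python.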

What To Bring

Data Engineering Fundamentals

  • 4+ years of hands-on data engineering experience in a production environment
  • Strong understanding of data pipeline design patterns including incremental loading, idempotency, schema evolution, and error handling
  • Experience working across multiple ingestion methods — APIs, CDC, file-based, batch, and streaming
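Two of the design patterns listed above, incremental loading and idempotency, can be shown with a minimal merge-by-key sketch (the key name and records are illustrative):

```python
def merge_incremental(target, batch, key="id"):
    """Upsert batch rows into target, keyed by `key`.

    Replaying the same batch leaves the target unchanged (idempotent),
    which is what makes safe pipeline retries possible.
    """
    merged = {row[key]: row for row in target}
    for row in batch:
        merged[row[key]] = row  # insert new keys, overwrite changed ones
    return list(merged.values())

target = [{"id": 1, "v": "a"}]
batch = [{"id": 1, "v": "a2"}, {"id": 2, "v": "b"}]
once = merge_incremental(target, batch)
twice = merge_incremental(once, batch)  # re-running the batch changes nothing
```

The same semantics are what a Delta Lake `MERGE INTO` statement provides at table scale.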

Databricks & Lakehouse

  • Solid hands-on experience with Databricks including Delta Lake, Unity Catalog, and Databricks Workflows
  • Working knowledge of Medallion Architecture and how to structure Bronze, Silver, and Gold layers for different data domains
  • Proficiency in PySpark and Spark SQL for large-scale data processing and transformation

Data Pipelines & Orchestration

  • Experience building and maintaining production pipelines using orchestration tools such as Apache Airflow, Databricks Workflows, or Azure Data Factory
  • Comfortable debugging and optimizing slow or failing pipelines in a production environment
  • Familiarity with CI/CD practices for data pipelines including version control, automated testing, and deployment pipelines

API Development

  • Experience designing and building RESTful APIs to serve data to downstream consumers
  • Proficiency with Python-based API frameworks such as FastAPI or Flask
  • Understanding of API best practices including authentication, error handling, versioning, and documentation

SQL & Azure

  • Strong SQL skills including window functions, CTEs, complex joins, and performance tuning
  • Hands-on experience with Azure data services: Azure Data Lake Storage Gen2, Azure Data Factory, Azure Synapse Analytics, Azure Purview, and Azure Key Vault
  • Comfortable working within Azure DevOps for source control, pipelines, and release management
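The CTE and window-function skills listed above, in a small self-contained example (run here against SQLite for portability; the production table is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE production (well TEXT, day TEXT, oil_bbl REAL);
INSERT INTO production VALUES
  ('W1', '2024-01-01', 100), ('W1', '2024-01-02', 110),
  ('W2', '2024-01-01', 90),  ('W2', '2024-01-02', 95);
""")

# CTE plus a window function: running total of oil per well, ordered by day.
rows = conn.execute("""
WITH daily AS (
  SELECT well, day, oil_bbl FROM production
)
SELECT well, day,
       SUM(oil_bbl) OVER (PARTITION BY well ORDER BY day) AS running_bbl
FROM daily
ORDER BY well, day
""").fetchall()
```

The same query shape carries over to Spark SQL or Synapse; only the connection layer changes.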

Data Science Collaboration

  • Good foundational understanding of data science workflows — feature engineering, model training, and experiment tracking
  • Experience supporting MLflow-based workflows within Databricks including experiment logging and model registry
  • Able to build and maintain feature pipelines that meet the reliability and freshness requirements of ML models

Nice To Have

  • Experience with Microsoft Fabric and Microsoft OneLake
  • Experience in the Oil & Gas or energy industry, particularly with operational, production, or field data
  • Familiarity with dbt for analytics engineering and modular SQL transformation workflows
  • Exposure to streaming data patterns using Kafka, MQTT, or Azure Event Hubs
  • Experience with infrastructure-as-code tools such as Terraform or Bicep on Azure
  • Familiarity with data governance concepts including lineage, data cataloging, and access control

About You

You care deeply about the quality and reliability of the data you build. You're the kind of engineer who asks "what happens when this fails?" before something ships, and "why does this exist?" before adding complexity. You work well independently but thrive in a collaborative team environment. You're ready to be a technical anchor on a high-impact platform, and you're excited to grow into broader technical leadership over time.

What Success Looks Like In Year One

  • You own and have delivered several production pipelines end-to-end, with monitoring and documentation in place
  • Your code is well-regarded in reviews — clean, tested, and built to last
  • You've shipped at least one API that is actively consumed by a downstream team
  • Data Scientists and Product owners are getting what they need faster because of pipelines you built or improved
  • You've made junior engineers around you measurably better through pairing and review

Additional Details

  • Visa Sponsorship: Employment eligibility to work with Quorum Software in India is required as the company will not pursue visa sponsorship for this position.

Quorum Diversity Statement: At Quorum, we are committed to fostering, cultivating, and preserving a culture of belonging. We want to be the place where a diverse pool of talented people join us, stay with us and do their best work. With a diverse team of employees, we grow and learn better together. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and our achievements. We are fully focused on equity and equality and believe deeply in diversity of race, gender, sexual orientation, age, religion, ethnicity, national origin, ability, neurodiversity and all the other characteristics that make us unique.

Quorum Business Solutions and Quorum Software are Equal Opportunity Employers. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, ancestry, veteran status, disability, genetic information, or any other basis protected by law.

Those applicants requiring reasonable accommodation to the application and/or interview process should notify a member of the Human Resources Department.

Recruitment Scam Alert: Quorum Software does not charge fees, request payments, conduct interviews via messaging apps, or request the installation of software at any stage of the recruitment process. All legitimate recruitment activities are conducted exclusively through our official careers website (www.quorumsoftware.com/careers) and email addresses ending in @quorumsoftware.com. Any communication that does not originate from these official channels should be considered unauthorized and may be reported to [email protected]

Job ID: 145772471
