StackNexus

Senior Data Engineer

6–9 Years
  • Posted 13 hours ago

Job Description

Experience

  • 6–9 years of relevant experience in data engineering or related roles.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines for structured and semi-structured data.
  • Build robust data ingestion frameworks from APIs, databases, SaaS platforms, logs, and telemetry systems.
  • Develop and optimize data warehouses and data lakes for high performance and scalability.
  • Implement data transformations, aggregations, and business logic in line with business requirements.
  • Ensure data quality, validation, reconciliation, and consistency across systems.
  • Monitor data pipelines for performance, failures, and data freshness; implement alerting and observability.
  • Collaborate with data analysts, data scientists, and product teams to deliver high-quality datasets.
  • Implement security, access control, and compliance best practices for data platforms.
  • Document data models, pipelines, and operational processes for maintainability and governance.
  • Support downstream use cases such as BI dashboards, reporting, billing systems, and machine learning pipelines.
  • Continuously improve data engineering practices, performance, and reliability.
  • Participate in architecture discussions and help define scalable data platform strategies.

Required Skills

  • Strong programming skills in Python and SQL.
  • Hands-on experience with ETL/ELT frameworks (e.g., Airflow, dbt, Spark).
  • Expertise in relational databases and modern data warehouses.
  • Strong understanding of data modelling (star/snowflake schemas), indexing, and query optimization.
  • Experience building batch and/or streaming data pipelines.
  • Experience with cloud platforms (AWS, Azure, or GCP).
  • Proficiency with object storage systems (S3, GCS, ADLS).
  • Experience with data formats such as Parquet, Avro, and JSON.
  • Familiarity with APIs and data integration patterns.

Preferred/Additional Skills

  • Experience with real-time or telemetry data pipelines.
  • Exposure to data reconciliation, billing systems, or financial datasets.
  • Experience with data platforms such as Snowflake, BigQuery, Redshift, Incorta.
  • Knowledge of CI/CD pipelines for data engineering workflows.
  • Understanding of data governance, lineage, and metadata management.
  • Experience supporting high-scale or enterprise-grade systems.
  • Familiarity with distributed data processing and performance tuning.

Additional Qualifications

  • Proven experience in designing and implementing scalable data pipelines and platforms.
  • Strong problem-solving and analytical skills.
  • Experience in cross-functional, agile teams.
  • Excellent communication and documentation skills.

Work Details

  • Hybrid work model: Minimum 2 days in office (Tuesday and Thursday); additional days as needed.
  • Location: Pune, Maharashtra, India.
  • Normal business hours with flexibility for cross-time-zone collaboration.


Job ID: 145618035