

Job Description

Python Developer (Apache Beam)

Role: Python Developer – Data Engineering

Experience: 5–9 Years

Location: Pune / Hyderabad / Bangalore

Role Purpose

The Python Developer will design, build, and maintain scalable data processing solutions using Python and Apache Beam.

The role focuses on developing robust batch and streaming pipelines, ensuring data quality, performance, security, and compliance in a regulated banking environment.

Key Responsibilities

  • Design and develop data pipelines using Python and Apache Beam for batch and streaming workloads.
  • Build scalable and fault-tolerant data processing solutions on cloud platforms (GCP preferred).
  • Develop reusable, maintainable, and efficient Python code following HSBC coding standards.
  • Integrate data pipelines with multiple data sources (files, databases, APIs, messaging systems).
  • Optimize data pipelines for performance, cost, and reliability.
  • Implement data validation, error handling, logging, and monitoring.
  • Ensure compliance with HSBC security, governance, and data privacy standards.
  • Support CI/CD automation for data pipelines.
  • Collaborate with architects, product owners, QA, and DevOps teams.
  • Participate in Agile ceremonies and contribute to sprint deliverables.
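The validation and error-handling responsibility above often takes the form of routing bad records to a "dead letter" collection instead of failing the whole pipeline. The sketch below shows that pattern in plain Python (the record shape and the name `validate_record` are hypothetical); in Beam, the same logic would typically live in a `DoFn` with tagged outputs.

```python
# Hypothetical validation-with-dead-letter sketch: invalid records are
# logged and set aside rather than crashing the run.
import logging


def validate_record(record, dead_letters):
    """Return the record if valid; otherwise log it, collect it, return None."""
    account, amount = record
    if not account or amount < 0:
        logging.warning("Rejected record: %r", record)
        dead_letters.append(record)
        return None
    return record


good, bad = [], []
for rec in [("ACCT-1", 120.0), ("", 10.0), ("ACCT-2", -5.0)]:
    out = validate_record(rec, bad)
    if out is not None:
        good.append(out)
# `good` keeps the one valid record; `bad` holds the two rejected ones.
```

Collecting rejects this way preserves them for monitoring and replay, which supports the logging and data-quality requirements listed above.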

Mandatory Skills

  • Strong hands-on experience in Python (3.x).
  • Hands-on experience with Apache Beam (batch and/or streaming pipelines).
  • Strong understanding of ETL / ELT concepts and data processing frameworks.
  • Experience with SQL and relational databases.
  • Experience with cloud-native data services (GCP preferred).
  • Solid understanding of data structures, OOP concepts, and design principles.
  • Experience with Git and version control.
  • Exposure to CI/CD pipelines (Jenkins / GitLab / Azure DevOps).
  • Strong debugging, performance tuning, and problem-solving skills.

Preferred / Good-to-Have Skills

  • Experience with Google Cloud Platform (GCP) services such as BigQuery, Pub/Sub, Dataflow.
  • Experience with streaming platforms (Kafka / Pub/Sub).
  • Knowledge of PySpark / Spark.
  • Experience with Airflow / Composer for orchestration.
  • Familiarity with Docker and Kubernetes.
  • Experience working in banking or financial services domain.
  • Understanding of data governance, lineage, and regulatory controls.

Testing & Quality

  • Experience with unit testing frameworks (PyTest, unittest).
  • Knowledge of data quality checks and validation frameworks.
  • Familiarity with monitoring and alerting tools.
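A pytest-style unit test of the kind mentioned above usually targets the small pure functions that pipelines are composed of. The function `mask_account` below is purely illustrative (a made-up data-privacy helper, not part of this role's codebase), shown only to demonstrate the testing shape.

```python
# Hedged sketch: a pytest-style unit test for a hypothetical pure
# transform function of the sort Beam pipelines are built from.
def mask_account(account_id: str) -> str:
    """Mask all but the last four characters of an account id."""
    return "*" * max(len(account_id) - 4, 0) + account_id[-4:]


def test_mask_account():
    # Long ids are masked; short ids pass through unchanged.
    assert mask_account("12345678") == "****5678"
    assert mask_account("678") == "678"


test_mask_account()  # pytest would discover and run this automatically
```

Because the function has no Beam dependencies, the same test runs in CI without spinning up a pipeline runner.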

Soft Skills

  • Strong communication and stakeholder interaction skills.
  • Ability to work in a global, multi-vendor delivery model.
  • Proactive, delivery-focused, and detail-oriented.
  • Comfortable working in a highly regulated environment.

More Info

Open to candidates from: India

Job ID: 144408601
