
Recro

Azure Data Streaming Engineer

  • Posted 18 hours ago
  • Be among the first 10 applicants

Job Description

Role: Azure Data Streaming Engineer

Location: Bangalore, India

Experience: 9+ years

Skills: ADF (Must have), Databricks, PySpark, SQL, Python, Azure Functions, Real-Time Streaming (Must have), Azure IoT Hub (Must have), Data Pipelines, Delta Live Tables (Good to have), Apache Kafka (Must have), Azure Event Hubs (Must have).

About the Role:

We are looking for an Azure Data Streaming Engineer who lives and breathes high-velocity data. If you are a Data Engineer who thinks like a Software Engineer—prioritizing low latency, event-driven architecture, and clean code over simple batch ETL—this role is for you.

You will be joining our team to architect, build, and maintain mission-critical streaming pipelines. You won't just be moving data from A to B; you will be building the systems that power live tracking, real-time triggers, and high-frequency event processing.

What You Will Do:

  • Architect Real-Time Pipelines: Design and implement robust, scalable, and low-latency data streaming architectures.
  • Build & Optimize: Develop high-performance data processing pipelines using Python, PySpark, and Databricks.
  • Event-Driven Logic: Build complex event-driven workflows (e.g., real-time notifications, IoT telemetry processing, live logistics tracking).
  • Bridge Backend & Data: Collaborate with software engineers to integrate data pipelines directly into application backends.
  • Monitor & Scale: Ensure system reliability, uptime, and performance in a high-concurrency Azure environment.
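The event-driven logic described above can be sketched in plain Python. This is a minimal, illustrative dispatcher only; the `TelemetryEvent` fields, the `EventRouter` class, and the 30.0 threshold are hypothetical examples, not part of the role or of any Azure SDK:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class TelemetryEvent:
    """One IoT reading, as might be consumed from an event stream."""
    device_id: str
    metric: str
    value: float

class EventRouter:
    """Minimal event-driven dispatcher: handlers fire as each event
    arrives, rather than waiting for a batch window."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[TelemetryEvent], None]]] = {}

    def on(self, metric: str, handler: Callable[[TelemetryEvent], None]) -> None:
        # Register a handler for a given metric name.
        self._handlers.setdefault(metric, []).append(handler)

    def dispatch(self, event: TelemetryEvent) -> None:
        # Fan the event out to every handler registered for its metric.
        for handler in self._handlers.get(event.metric, []):
            handler(event)

# Example real-time trigger: record an alert when a temperature
# reading exceeds a (hypothetical) threshold.
alerts: List[str] = []
router = EventRouter()
router.on(
    "temperature",
    lambda e: alerts.append(e.device_id) if e.value > 30.0 else None,
)

router.dispatch(TelemetryEvent("sensor-1", "temperature", 35.2))
router.dispatch(TelemetryEvent("sensor-2", "temperature", 21.0))
# alerts now holds only the device that breached the threshold
```

In production, the same per-event pattern would sit behind a consumer for Azure Event Hubs or Kafka instead of direct `dispatch` calls.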

Must-Have Qualifications:

  • Strong Python Backend Coding: This is not a SQL-only role. You must be comfortable writing production-grade, maintainable Python code.
  • Real-Time/Streaming Expertise: Hands-on experience with streaming datasets and real-time architectures (not limited to batch).
  • Azure Mastery: Deep knowledge of the Azure Data ecosystem, specifically Azure Event Hubs and Azure IoT Hub.
  • Processing Frameworks: Proven experience with PySpark and Databricks.
  • Message Brokers: Professional experience with Apache Kafka or similar distributed streaming platforms.
  • Data Fundamentals: Strong SQL skills for complex data transformation and analytical queries.

Job ID: 147203239