Big Data Engineer (Scala, Kafka, Docker, Kubernetes)

4-5 Years
  • Posted a month ago
  • Over 50 applicants

Job Description

Key Responsibilities:

  • Design and implement scalable, fault-tolerant data pipelines using Scala, Kafka, and other Big Data technologies.
  • Build real-time data streaming applications to process high-volume datasets from various sources.
  • Integrate and manage containerized applications using Docker and orchestrate deployments with Kubernetes.
  • Work with distributed data processing frameworks such as Apache Spark, Flink, or Kafka Streams.
  • Optimize data ingestion, transformation, and storage processes for performance and reliability.
  • Collaborate with data scientists, analysts, and backend engineers to meet analytical and operational needs.
  • Monitor and troubleshoot production issues in real time, ensuring high availability and low latency.
  • Implement data quality, observability, and monitoring best practices across the pipeline.
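To illustrate the kind of per-record streaming logic the role involves, here is a minimal, library-free Scala sketch: it parses raw events, drops malformed records (as a real pipeline would route them to a dead-letter topic), and keeps a running count per key, analogous to a `groupByKey().count()` aggregation in the Kafka Streams DSL. The event shape and field names are hypothetical, chosen only for illustration.

```scala
// Illustrative sketch (no Kafka dependency) of per-record streaming logic:
// parse raw events, key them, and maintain a running count per key,
// similar in spirit to a KTable-style aggregation in Kafka Streams.
object ClickstreamSketch {
  // A raw event as it might arrive on a topic, encoded as "userId,action".
  // (Hypothetical format, for illustration only.)
  final case class Event(userId: String, action: String)

  // Parse one record; malformed input yields None and is dropped,
  // where a production pipeline would send it to a dead-letter topic.
  def parse(raw: String): Option[Event] = raw.split(",", 2) match {
    case Array(id, action) if id.nonEmpty && action.nonEmpty =>
      Some(Event(id.trim, action.trim))
    case _ => None
  }

  // Fold a stream of raw records into per-user action counts.
  def countByUser(records: Iterator[String]): Map[String, Long] =
    records.flatMap(parse).foldLeft(Map.empty[String, Long]) { (acc, ev) =>
      acc.updated(ev.userId, acc.getOrElse(ev.userId, 0L) + 1L)
    }
}
```

In a real deployment this logic would sit inside a Kafka Streams topology (or a Spark/Flink job) rather than a plain fold, with the state store handling the running counts fault-tolerantly.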

Qualifications and Requirements:

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • 4+ years of experience in Big Data and streaming platform development.
  • Proficient in Scala (required); knowledge of Java or Python is a plus.
  • Hands-on experience with Apache Kafka (Kafka Connect, Kafka Streams, or Confluent Platform).
  • Strong experience in Docker containerization and Kubernetes orchestration.
  • Familiarity with Big Data frameworks: Apache Spark, Hadoop, or Flink.
  • Working knowledge of CI/CD pipelines, Git, and monitoring tools (e.g., Prometheus, Grafana).
  • Experience with both batch and stream processing architectures.

Desirable Skills and Certifications:

  • Experience with cloud data platforms (e.g., AWS EMR, GCP Dataflow, Azure Synapse).
  • Understanding of data lake, data warehouse, and data mesh principles.
  • Familiarity with NoSQL databases (e.g., Cassandra, HBase, or MongoDB).
  • Certifications like Confluent Certified Developer, CKA/CKAD, or Databricks Spark Developer.
  • Knowledge of security and compliance in Big Data environments (encryption, RBAC, auditing).

More Info

Open to candidates from: Indian

About Company

Teamware Solutions, a business division of Quantum Leap Consulting Private Limited, offers cutting-edge industry solutions that derive business value from our clients' staffing initiatives. With deep domain expertise across the Banking, Financial Services and Insurance, Oil and Gas, Infrastructure, Manufacturing, Retail, Telecom, and Healthcare industries, Teamware provides skills augmentation and professional consulting services.

Job ID: 121754571