Data Engineer – Kafka & Spark

  • Posted 6 hours ago

Job Description

Role: Data Engineer – Kafka & Spark

Location: Gurugram | Delhi NCR

Experience: 6+ years

Immediate Joiners Only

  • Data Engineer + Kafka (6+ years' experience)
  • Primary tools/platforms: data engineering skills on any cloud, Kafka, streaming data processing experience
  • Language: SQL, PySpark
  • Expertise in SQL; should be able to write complex SQL queries for data analysis
  • SQL: aggregation, window functions, joins, performance tuning
  • Spark: expertise in Spark architecture and processing; should be able to write performance-optimized code
  • Data modeling: experience with ER models and dimensional modeling (fact/dimension)
  • ETL: should understand standard ETL processes such as SCD, delta loads, etc.
  • Experience in design and development of end-to-end data pipelines

  • Good to have: knowledge of a data visualization tool, e.g., Power BI or Tableau.
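The SQL skills listed above (aggregation, window functions, joins) can be illustrated with a small sketch. This is not part of the role description; the table and column names are invented for illustration, and SQLite stands in for whatever warehouse engine the role actually uses:

```python
import sqlite3

# Illustrative only: a running total and per-partition row number,
# the kind of window-function query the role calls out.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-01', 120.0),
  ('alice', '2024-02-01', 80.0),
  ('bob',   '2024-01-15', 200.0);
""")
rows = conn.execute("""
SELECT customer,
       order_date,
       amount,
       SUM(amount) OVER (PARTITION BY customer ORDER BY order_date) AS running_total,
       ROW_NUMBER() OVER (PARTITION BY customer ORDER BY order_date) AS order_rank
FROM orders
ORDER BY customer, order_date
""").fetchall()
```

The `PARTITION BY` clause restarts both the running total and the row numbering for each customer, which is the behavior interviewers typically probe for.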
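Likewise, the SCD and delta-load pattern mentioned in the ETL bullet can be sketched in plain Python. This is a hypothetical Type 2 SCD merge, not code from the employer; all field names (`key`, `value`, `start_date`, `end_date`) are assumptions, and a real pipeline would do this in Spark or SQL:

```python
from datetime import date

def scd2_merge(dimension, incoming, today):
    """Hypothetical SCD Type 2 merge: expire changed rows, append new versions."""
    # Index the currently-open rows (end_date is None) by business key.
    current = {r["key"]: r for r in dimension if r["end_date"] is None}
    for row in incoming:
        existing = current.get(row["key"])
        if existing is None:
            # Brand-new key: insert an open-ended row.
            dimension.append({**row, "start_date": today, "end_date": None})
        elif existing["value"] != row["value"]:
            # Attribute changed: close out the old row, append a new version.
            existing["end_date"] = today
            dimension.append({**row, "start_date": today, "end_date": None})
        # Unchanged rows are left untouched (delta load: only changes applied).
    return dimension
```

For example, merging an update for an existing key and one new key adds two rows and closes the superseded one, preserving full history.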


Job ID: 147480145