Hadoop Developer

4-6 Years
  • Posted a month ago

Job Description

Key Responsibilities:

  • Develop, test, and deploy Hadoop-based data processing workflows using tools like MapReduce, Hive, Pig, and Spark.
  • Design and implement ETL/ELT pipelines to ingest and process large volumes of structured and unstructured data.
  • Write efficient Hive queries, optimize MapReduce jobs, and develop Spark applications using Scala, Java, or Python.
  • Work with HDFS for storage management and data ingestion strategies.
  • Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
  • Monitor and troubleshoot Hadoop jobs and cluster performance issues.
  • Ensure data quality, data governance, and security compliance in big data solutions.
  • Maintain documentation for code, processes, and workflows.
  • Participate in code reviews, testing, and deployment activities.
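For context, the MapReduce model named in the first responsibility can be sketched outside Hadoop as a plain-Python map and reduce over text lines (illustrative only; an actual job would run as a Java MapReduce driver, via Hadoop Streaming, or as a Spark application):

```python
from collections import defaultdict

def map_phase(lines):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Sum counts per key, as a reducer would after the shuffle/sort.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big pipelines", "data quality"]
print(reduce_phase(map_phase(lines)))  # {'big': 2, 'data': 2, 'pipelines': 1, 'quality': 1}
```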

Qualifications and Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience as a Hadoop Developer or Big Data Engineer.
  • Strong experience with Hadoop ecosystem components such as HDFS, MapReduce, Hive, Pig, HBase, Oozie, Sqoop, and Flume.
  • Proficient in programming languages such as Java, Scala, or Python for developing big data applications.
  • Experience with Apache Spark for batch and stream processing is highly desirable.
  • Familiarity with data modeling, schema design, and query optimization techniques in big data environments.
  • Knowledge of Linux/Unix systems and shell scripting.
  • Experience working with cloud-based big data platforms (AWS EMR, Azure HDInsight, Google Dataproc) is a plus.
  • Good problem-solving skills and ability to work in a collaborative Agile environment.

Desirable Skills:

  • Experience with real-time data streaming tools like Kafka or Storm.
  • Knowledge of NoSQL databases such as HBase, Cassandra, or MongoDB.
  • Familiarity with DevOps and CI/CD pipelines for big data workflows.
  • Understanding of data security and privacy best practices in big data environments.
  • Excellent communication and teamwork skills.

More Info

Open to candidates from: Indian

About Company

Teamware Solutions, a business division of Quantum Leap Consulting Private Limited, offers cutting-edge industry solutions that derive business value from our clients' staffing initiatives. With deep domain expertise in the Banking, Financial Services and Insurance, Oil and Gas, Infrastructure, Manufacturing, Retail, Telecom, and Healthcare industries, Teamware leads in skills augmentation and professional consulting services.

Job ID: 121755945