
Hadoop Data Engineer

5-8 Years
  • Posted 11 hours ago

Job Description

• Design, develop, and maintain big data pipelines using Hadoop ecosystem technologies such as HDFS, Hive, and Spark.

• Build scalable data processing solutions using Spark and Python for large-scale data handling.

• Work on Hadoop to AWS migration projects ensuring seamless data transition and performance optimization.

• Develop and maintain Unix shell scripts for automation of data workflows and system processes.

• Collaborate with cross-functional teams to design, develop, and implement data engineering solutions.

• Participate in Agile development cycles including design, development, testing, and deployment activities.

• Ensure data quality, integrity, and performance across distributed data systems.

• Optimize existing big data applications for better performance and scalability.

• Support application design, software development, and testing processes.

• Troubleshoot and resolve issues in big data pipelines and processing systems.
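The responsibilities above center on Spark-style batch transformations with data-quality gates. As a minimal sketch of that shape of work: a production job would use PySpark (e.g. `spark.read`, `DataFrame.filter`, `groupBy`), but plain-Python stand-ins are used here so the example runs anywhere; the field names (`user_id`, `count`) are illustrative assumptions, not from the posting.

```python
# Sketch of a filter + group-by + sum step, the same shape as a typical
# Spark pipeline stage (hypothetical schema: user_id, count).
from collections import defaultdict

def aggregate_events(rows):
    """Drop malformed rows, then sum event counts per user."""
    totals = defaultdict(int)
    for row in rows:
        # Data-quality gate: skip rows missing a key or with a negative count.
        if not row.get("user_id") or row.get("count", -1) < 0:
            continue
        totals[row["user_id"]] += row["count"]
    return dict(totals)

sample = [
    {"user_id": "u1", "count": 3},
    {"user_id": "u1", "count": 2},
    {"user_id": "", "count": 9},   # malformed: dropped by the quality gate
    {"user_id": "u2", "count": 4},
]
print(aggregate_events(sample))  # → {'u1': 5, 'u2': 4}
```

In a real Spark job the gate and aggregation would run distributed across executors; the logic per record is the same.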

More Info

Job Type:
Function:
Employment Type:
Open to candidates from: India

About Company

Logicplanet IT Services (India) Pvt. Ltd., incorporated in 2007 and headquartered in Hyderabad, operates as a software publishing, consulting, and IT solutions provider. The company delivers enterprise technology services including software development, digital transformation, and IT staffing solutions. With expertise in areas such as embedded systems, QA automation, ERP, and cloud technologies, Logicplanet supports global clients by combining technical innovation with workforce solutions, positioning itself as both a technology partner and a recruitment facilitator.

Job ID: 147241761

Similar Jobs

Bengaluru, India

Skills:

S3, Kafka, Tableau, Artifactory, EC2, Selenium, Oracle, Python, AWS, Java, Spark Streaming, Hadoop, Power BI, Scala, Jenkins, Git, MS SQL, Hive, DB2, Linux, Unix Shell, Spark, Advanced SQL, Airflow, Chef, Control-M, Apache Hudi, SageMaker

Bengaluru, India

Skills:

Snowflake, Scripting, Hadoop, Apache Spark, Kafka, JSON, Avro, Apache, SQL, Hive, Sybase IQ, Kubernetes, Python, Parquet, HDFS, Iceberg

Bengaluru, India

Skills:

Snowflake, Informatica Cloud, Hadoop, PostgreSQL, Apache Spark, Impala, SQL, Google Cloud, ELT, Jenkins, Git, Hive, Azure, Python, Unix Shell Scripting, ETL, AWS, HDFS, Access Management, Data Encryption

Bengaluru, India

Skills:

Snowflake, Apache Spark, SQL, Java, Hadoop, JSON, Kafka, Avro, Hive, Python, Parquet, HDFS, Apache Iceberg