
Responsibilities:
• Design, develop, and maintain big data pipelines using Hadoop ecosystem technologies such as HDFS, Hive, and Spark.
• Build scalable data processing solutions using Spark and Python for large-scale data handling.
• Work on Hadoop-to-AWS migration projects, ensuring seamless data transition and performance optimization.
• Develop and maintain Unix shell scripts for automation of data workflows and system processes.
• Collaborate with cross-functional teams to design, develop, and implement data engineering solutions.
• Participate in Agile development cycles including design, development, testing, and deployment activities.
• Ensure data quality, integrity, and performance across distributed data systems.
• Optimize existing big data applications for better performance and scalability.
• Support application design, software development, and testing processes.
• Troubleshoot and resolve issues in big data pipelines and processing systems.
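The Unix shell automation responsibility above can be sketched as a small, hypothetical workflow step; the file names, record-count logging, and archive layout below are illustrative assumptions, not part of the posting:

```shell
#!/bin/sh
# Minimal sketch of the kind of data-workflow automation the role describes.
# File names ("sample.csv") and the archive layout are assumptions.

# process_file: validate an incoming data file, log a record count,
# and move it into a timestamped archive directory for traceability.
process_file() {
    input=$1
    archive_dir=$2

    # Data-quality gate: fail fast if the file is missing or empty.
    [ -s "$input" ] || { echo "ERROR: $input missing or empty" >&2; return 1; }

    # Count records (excluding the header row) so each run can be audited.
    records=$(($(wc -l < "$input") - 1))
    echo "Processing $records records from $input"

    # Archive the input with a timestamp so repeated runs never collide.
    mkdir -p "$archive_dir"
    mv "$input" "$archive_dir/$(date +%Y%m%d%H%M%S)_$(basename "$input")"
}
```

A scheduler such as Control-M or Airflow (both listed in the skills below) would typically invoke a script like this per arriving file.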
About the Company:
Logicplanet IT Services (India) Pvt. Ltd., incorporated in 2007 and headquartered in Hyderabad, operates as a software publishing, consulting, and IT solutions provider. The company delivers enterprise technology services including software development, digital transformation, and IT staffing solutions. With expertise in areas such as embedded systems, QA automation, ERP, and cloud technologies, Logicplanet supports global clients by combining technical innovation with workforce solutions, positioning itself as both a technology partner and a recruitment facilitator.
Job ID: 147241761
Skills:
S3, Kafka, Tableau, Artifactory, EC2, Selenium, Oracle, Python, AWS, Java, Spark Streaming, Hadoop, Power BI, Scala, Jenkins, Git, MS SQL, Hive, DB2, Linux, Unix Shell, Spark, Advanced SQL, Airflow, Chef, Control-M, Apache Hudi, SageMaker