
• Design, develop, and maintain end-to-end big data pipelines for enterprise applications
• Work on Apache Hadoop and Apache Spark-based data processing frameworks
• Manage and optimize data pipelines, scripts, and production systems for performance and stability
• Perform unit testing of data pipelines and support UAT (User Acceptance Testing) activities
• Debug and resolve issues in existing data pipelines, scripts, and production workflows
• Modify and enhance existing data processing systems based on business requirements
• Review design, code, and deliverables to ensure high-quality output
• Build and maintain scalable and secure big data solutions using cloud platforms
• Ensure implementation of data governance, security, and compliance standards
• Mentor junior team members and contribute to knowledge sharing initiatives
• Participate in Agile development practices and cross-team collaboration
• Handle Proof of Concepts (PoCs) and deliver solutions within timelines
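The unit-testing duty above can be illustrated with a minimal, hypothetical sketch: a pipeline step factored as a pure function so it can be tested without a running cluster. Plain Python is used here for brevity; in a Spark-based pipeline the same idea applies to a DataFrame transformation. All names (`clean_records`, the field names) are illustrative assumptions, not part of the role description.

```python
# Hypothetical sketch: one pipeline step written as a pure function
# so it can be unit-tested in isolation, without a Hadoop/Spark cluster.

def clean_records(records):
    """Drop rows with a missing id and normalize amounts to floats."""
    cleaned = []
    for row in records:
        if row.get("id") is None:
            continue  # skip records that fail the id check
        cleaned.append({"id": row["id"], "amount": float(row.get("amount", 0))})
    return cleaned

# A unit test for the step: malformed input is dropped, valid input is normalized.
sample = [{"id": 1, "amount": "10.5"}, {"id": None, "amount": "3"}]
assert clean_records(sample) == [{"id": 1, "amount": 10.5}]
```

Keeping transformation logic out of the job's orchestration code is what makes this kind of fast, cluster-free unit test possible before UAT.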
Logicplanet IT Services (India) Pvt. Ltd., incorporated in 2007 and headquartered in Hyderabad, is a software publishing, consulting, and IT solutions provider. The company delivers enterprise technology services including software development, digital transformation, and IT staffing. With expertise in embedded systems, QA automation, ERP, and cloud technologies, Logicplanet serves global clients as both a technology partner and a recruitment facilitator, combining technical innovation with workforce solutions.
Job ID: 146916955