
Key Responsibilities
• Design, build, and maintain scalable data pipelines and data platforms on AWS
• Develop and optimize data processing workflows using Databricks, PySpark, and Scala
• Implement and manage data ingestion, transformation, and storage solutions for large-scale systems
• Work with SQL for data extraction, transformation, and analysis across complex datasets
• Build and support cloud-based data architectures aligned with modern best practices
• Implement event-driven data processing architectures for real-time and batch use cases
• Collaborate with data analysts, data scientists, and engineering teams to support data requirements
• Ensure data quality, performance optimization, and system reliability across platforms
• Develop reusable and efficient code for scalable data processing solutions
• Support data visualization and analytics requirements for business insights
• Work with distributed systems handling large volumes of structured and unstructured data
• Troubleshoot and resolve data pipeline and platform issues in production environments
Logicplanet IT Services (India) Pvt. Ltd., incorporated in 2007 and headquartered in Hyderabad, operates as a software publishing, consulting, and IT solutions provider. The company delivers enterprise technology services including software development, digital transformation, and IT staffing solutions. With expertise in areas such as embedded systems, QA automation, ERP, and cloud technologies, Logicplanet supports global clients by combining technical innovation with workforce solutions, positioning itself as both a technology partner and a recruitment facilitator.
Job ID: 146981559