
  • Posted 13 hours ago

Job Description

Roles & Responsibilities:

  • Design, develop, and optimize data processing pipelines using PySpark
  • Collaborate with multiple teams to contribute to key technical and business decisions
  • Provide effective solutions to complex problems across teams
  • Develop and manage Airflow workflows (DAGs) for data orchestration
  • Perform data analysis and validation using SQL
  • Monitor, debug, and optimize performance of distributed data processing systems
  • Work with AWS services such as EMR, S3, and IAM for data pipeline deployment
  • Ensure efficient deployment and performance of applications on cloud platforms
  • Mentor junior team members and support their technical growth
  • Continuously improve software development processes and ensure adherence to Agile methodologies 
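The "data analysis and validation using SQL" responsibility above can be sketched with a minimal, self-contained example. Here `sqlite3` from the Python standard library stands in for whatever warehouse engine the role actually uses, and the `orders` table, its columns, and the sample rows are invented purely for illustration:

```python
import sqlite3

# Hypothetical validation pass: check a staging table for null values
# and duplicate keys before promoting it downstream.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 10.0), (2, 25.5), (2, 25.5), (3, None)],
)

# Count rows with a missing amount.
null_count = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL"
).fetchone()[0]

# Count order_ids that appear more than once.
dup_count = conn.execute(
    "SELECT COUNT(*) FROM ("
    "  SELECT order_id FROM orders"
    "  GROUP BY order_id HAVING COUNT(*) > 1"
    ")"
).fetchone()[0]

print(null_count, dup_count)  # prints: 1 1
```

In a PySpark/EMR pipeline the same checks would typically run as DataFrame aggregations or Spark SQL queries inside an Airflow task, gating the downstream steps of the DAG.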

More Info

Open to candidates from: India

About Company

Apptad offers strategic consulting, enterprise information management, and digital transformation services. With globally connected offices in the US and India and a team of trained, certified IT professionals, Apptad ensures quick and effective delivery to its customers. Apptad is continually reinventing how companies leverage data.

We strive to enable our customers to solve the biggest problems within their organizations. We study each client's problem and respond with a custom solution instead of handing over a boilerplate response.

Job ID: 145631639
