  • Posted 3 hours ago

Job Description

Key Responsibilities:

  • Design and develop scalable data pipelines using Google Cloud Platform (GCP)
  • Build and maintain ETL processes for data transformation and integration
  • Work with Big Data technologies to process large datasets efficiently
  • Develop data solutions using Python and SQL
  • Implement and optimize data warehousing solutions
  • Utilize Apache Spark for distributed data processing
  • Collaborate with BI teams to enable reporting and analytics
  • Ensure data quality, performance, and reliability of systems
  • Work in an Agile environment and collaborate with cross-functional teams
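As a rough illustration of the extract-transform-load work described above, here is a minimal sketch in plain Python. It is illustrative only: a production pipeline for this role would run on GCP services (e.g. BigQuery, Dataflow) or Apache Spark rather than the standard library, and all names below are hypothetical.

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: extract rows from CSV text, transform (drop bad
# records, cast types), load into a SQLite table standing in for a
# data warehouse. All table and column names are illustrative.

RAW_CSV = """user_id,amount
1,10.50
2,
3,7.25
"""

def extract(text):
    """Extract: parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with a missing amount, cast types."""
    return [(int(r["user_id"]), float(r["amount"]))
            for r in rows if r["amount"]]

def load(rows, conn):
    """Load: insert cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.75
```

The same extract/transform/load split maps directly onto a Spark or Dataflow job, where each stage becomes a distributed transformation over a DataFrame or PCollection instead of an in-memory list.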

Required Skills

  • 6-9 years of experience in Data Engineering
  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Proficiency in Python and SQL
  • Experience with ETL development and data warehousing concepts
  • Hands-on experience with Apache Spark
  • Good understanding of Big Data ecosystems
  • Exposure to Business Intelligence (BI) tools and reporting systems

(ref:hirist.tech)

Job ID: 143407305
