We are seeking an experienced GCP Data Engineer proficient in data processing frameworks, programming languages, and cloud-based solutions. The ideal candidate will have hands-on experience with GCP services such as BigQuery, Dataflow, and Spanner. Strong problem-solving abilities, a solid understanding of data security, and expertise in ETL processes are essential.
Key Responsibilities:
- Develop and implement data processing solutions using frameworks such as Apache Beam (Dataflow) and Kafka (see the illustrative sketch after this list).
- Work with GCP services like BigQuery, Dataflow, and Spanner to optimize data pipelines and workflows.
- Design and model data structures for efficient data storage and retrieval.
- Oversee ETL processes to ensure seamless data extraction, transformation, and loading.
- Collaborate with cross-functional teams to address data engineering challenges and improve data systems.
- Ensure scalability and security of data infrastructure while maintaining high performance.
- Leverage Apache Airflow or similar tools to manage workflows and pipelines (a minimal orchestration sketch appears at the end of this posting).
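To give candidates a concrete sense of the Beam/Dataflow work above, here is a minimal, hypothetical pipeline sketch. The bucket, project, dataset, and field names are placeholders, not an actual production pipeline:

```python
# Minimal sketch of a Beam pipeline of the kind this role builds.
# All bucket/project/table names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Parse one JSON event line into a BigQuery-ready row."""
    record = json.loads(line)
    return {"user_id": record["user_id"], "amount": float(record["amount"])}


def run() -> None:
    # On Dataflow you would pass --runner=DataflowRunner plus project/region flags.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING, amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline runs locally with the default runner or on Dataflow by switching runner options, which is the portability this role relies on when moving work between development and GCP.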
Key Requirements:
- Proficiency in programming languages like Python, Java, or Scala.
- Expertise in Apache Beam, Kafka, and GCP services such as BigQuery, Dataflow, and Spanner.
- Experience in data modeling, database design, and ETL processes.
- Strong problem-solving skills and the capacity to manage complex data engineering tasks.
- Familiarity with cloud storage solutions and data security best practices.
- Understanding of scalability principles in cloud-based data engineering environments.
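As a rough illustration of the workflow-management responsibility above, the following hypothetical Airflow DAG sketches a daily extract-then-load job. The DAG id, schedule, and task bodies are placeholders only:

```python
# Hypothetical Airflow DAG sketching daily ETL orchestration.
# DAG id, schedule, and task callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("extract source data")  # e.g. pull from Kafka or an upstream API


def load() -> None:
    print("load into BigQuery")  # e.g. launch a Dataflow job or a load job


with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # enforce extract-before-load ordering
```

Candidates comfortable reading and extending sketches like these two will be well matched to the day-to-day work of the role.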