GCP Data Engineer position

5-15 years
Job Description

1. Overall experience of 5.5 years, with a minimum of 4 years of relevant experience in Big Data
technologies.

2. Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark,
Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required for
building an end-to-end data pipeline. Working knowledge of real-time data pipelines is
an added advantage.
3. Strong experience in at least one of the programming languages Java or Scala; Java
preferred.

4. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB,
Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
5. Well-versed in, and with working knowledge of, data platform-related services on GCP.
6. Bachelor's degree and 6 to 8 years of work experience, or any combination of
education, training, and/or experience that demonstrates the ability to perform the
duties of the position.

JOB TYPE

Function: IT
Education: Bachelor of Technology (B.Tech/B.E.)

FutureA4 is a boutique search firm started by Jay Menon, who has nearly four decades of global leadership experience across various corporates, most recently as Senior Vice President and Global Head of Talent Acquisition with Polaris Consulting & Services Limited for almost nine years. FutureA4 is highly focused on all types of technical and non-technical roles for leadership and lateral hiring across the BFSI, Manufacturing, Retail, Hospital, Healthcare and Wellness domains. To know more, please refer to the website www.futurea4.com and the LinkedIn profile https://www.linkedin.com/in/jaymenon/
