
Qualification
Need to hire GCP-enabled Module Leads and Leads proficient in data engineering technologies and languages, who can drive and lead migrations from on-premises environments to GCP.
Role
5-10 years of experience implementing high-end software products.
Provides technical leadership in the Big Data space (the Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.
Must have: Operating knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services)
Should be familiar with columnar storage formats, e.g. Parquet, ORC
Visualize and evangelize next generation infrastructure in Cloud platform/Big Data space (Batch, Near Real-time, Real-time technologies).
Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms
Develops and implements an overall organizational data strategy aligned with business processes, including data model design, database development standards, and the implementation and management of data warehouses and data analytics systems
Expert-level proficiency in at least 4-5 GCP services
Experience building technical solutions to industry standards using GCP IaaS, PaaS, and SaaS capabilities
Strong understanding of, and experience with, distributed computing frameworks
Experience working in a Linux environment and with command-line tools, including shell/Python scripting for automating common tasks
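As a rough illustration of the kind of scripting automation this requirement describes (the task, file layout, and function name below are hypothetical examples, not taken from the posting), a minimal Python sketch that scans log text for error lines might look like:

```python
import re


def count_error_lines(log_text: str) -> int:
    """Count lines containing the word 'error' (case-insensitive).

    A toy example of automating a common Linux admin task
    (scanning service logs) with a short Python script.
    """
    pattern = re.compile(r"\berror\b", re.IGNORECASE)
    return sum(1 for line in log_text.splitlines() if pattern.search(line))


if __name__ == "__main__":
    sample = "INFO started\nERROR disk full\nWARN slow\nerror: retry failed\n"
    print(count_error_lines(sample))  # prints 2
```

In practice the same task is often done with `grep -ci error service.log`; candidates would be expected to move comfortably between shell one-liners and small Python utilities like this.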
Experience
5 to 10 years
Job ID: 144797915
Skills:
Java, BigQuery, Hadoop, Pyspark, Spring Boot, Sql, UNIX, Tensorflow, Restful Web Services, Cloud Storage, Git, Hive, Gcp, shell scripting, Linux, Perl, Spark, Dataproc, DataFlow, Python, Scikit-learn, Cloud Composer