
GCP Data Engineer:
Experience: 3 to 12 years
- Data Pipeline Development: Build and manage robust, scalable data pipelines for batch and streaming data.
- GCP Service Utilization: Leverage core GCP services:
  - Storage: Cloud Storage, BigQuery
  - Processing: Dataflow, Dataproc, Pub/Sub
  - Orchestration: Cloud Composer (Airflow)
  - Infrastructure: Compute Engine, Kubernetes Engine (GKE)
- Data Warehousing & Modeling: Design and optimize data warehouses (BigQuery) for analytical queries.
- ETL/ELT Processes: Implement data extraction, transformation, and loading.
- Automation & CI/CD: Automate deployments and tasks using Terraform and Cloud Build.
- Monitoring & Optimization: Implement monitoring, ensure performance, and troubleshoot issues.
- Security & Compliance: Adhere to security best practices and regulatory standards.
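To illustrate the kind of extract-and-transform step the responsibilities above describe, here is a minimal plain-Python sketch. Field names (`user_id`, `amount`) are hypothetical; in a real pipeline this logic would typically live inside a Dataflow/Beam transform that writes cleaned rows to BigQuery.

```python
import csv
import io

def transform_records(raw_csv: str) -> list[dict]:
    """Extract rows from raw CSV text, drop invalid records,
    and cast fields so they are ready to load into a warehouse table."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Skip rows missing the (hypothetical) required key.
        if not row.get("user_id"):
            continue
        rows.append({
            "user_id": row["user_id"],
            # Cast to float so the value fits a numeric column.
            "amount": float(row["amount"]),
        })
    return rows

raw = "user_id,amount\nu1,10.5\n,3.0\nu2,7\n"
print(transform_records(raw))
# → [{'user_id': 'u1', 'amount': 10.5}, {'user_id': 'u2', 'amount': 7.0}]
```

The same validate-then-cast pattern applies whether the source is a Cloud Storage file read in batch or a Pub/Sub message consumed in streaming mode.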
Job ID: 147523305
Skills:
Composer, Unix, BigQuery, Hadoop, UC4, Kafka, SQL, Control-M, GCP, RDBMS, Linux, CI/CD, Dataflow, Python, Airflow, Pub/Sub, Atomic, Google Cloud SDK, Cloud SQL, GCS, NiFi