
Brillio

UMG - GCP Data Specialist - R01559803

  • Posted a day ago

Job Description

Lead AI/ML Engineer

Specialization

Key Skills

Data Engineering & Analytics

  • SQL (Advanced)
  • Data Modeling & Query Optimization
  • BigQuery (DDL, Views, Authorized Views)
  • ETL / ELT Pipeline Development

Cloud & GCP Services

  • Google Cloud Platform (GCP)
  • BigQuery
  • Google Cloud Storage (GCS)
  • Dataflow
  • Pub/Sub

Programming & Tools

  • Python (data processing, scripting)
  • Java (basic understanding for data workflows)
  • Apache Airflow (orchestration and scheduling)
  • API Integration

Visualization & Reporting

  • Looker Studio (dashboard development)
  • Automated reporting using SQL

Roles & Responsibilities

  • Design, develop, and optimize scalable data pipelines to ingest, process, and store data from Google Cloud Storage (GCS) to BigQuery.
  • Build and maintain API-integrated workflows to extract data from external systems, process responses, and load results into downstream platforms.
  • Implement Pub/Sub-based event-driven architectures supporting both real-time and batch data processing, including message handling and pipeline triggers.
  • Create and manage BigQuery DDLs, views, and authorized views, ensuring secure access through appropriate roles and permissions.
  • Optimize ETL pipelines and complex SQL queries to improve performance, reduce processing time, and enhance BigQuery warehouse efficiency.
  • Perform DEV and UAT testing, validating business logic, data quality, and end-to-end pipeline stability.
  • Apply business transformation logic to convert raw datasets into analytics- and reporting-ready data models.
  • Develop weekly and monthly SQL-based reports and automate their distribution to business stakeholders via email.
  • Design and publish interactive dashboards using Looker Studio to enable clear data visualization and actionable insights.
  • Collaborate closely with clients, business analysts, and stakeholders to gather requirements and deliver accurate, efficient, and interpretable data solutions.
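To illustrate the kind of business transformation logic described in the responsibilities above, here is a minimal, hypothetical Python sketch that rolls raw transaction rows up into a weekly, reporting-ready summary. All names and fields are illustrative; in practice this logic would typically live in SQL inside BigQuery or in a Dataflow/Airflow task.

```python
from collections import defaultdict
from datetime import date, timedelta

def week_start(d: date) -> date:
    """Return the Monday that starts the ISO week containing d."""
    return d - timedelta(days=d.weekday())

def weekly_revenue_report(rows):
    """Roll raw transaction rows into a reporting-ready weekly summary.

    rows: iterable of dicts with 'day' (date), 'region' (str), 'amount' (float).
    Returns a sorted list of summary dicts keyed by (week_start, region).
    """
    totals = defaultdict(float)
    for row in rows:
        totals[(week_start(row["day"]), row["region"])] += row["amount"]
    return [
        {"week_start": wk, "region": region, "revenue": round(amount, 2)}
        for (wk, region), amount in sorted(totals.items())
    ]

# Example: two transactions in the same ISO week collapse into one summary row.
raw = [
    {"day": date(2024, 5, 6), "region": "EU", "amount": 100.0},  # Monday
    {"day": date(2024, 5, 8), "region": "EU", "amount": 50.5},   # same ISO week
    {"day": date(2024, 5, 13), "region": "EU", "amount": 10.0},  # next week
]
print(weekly_revenue_report(raw))
```

The same aggregation would be a `GROUP BY` over a week-truncated date in BigQuery; the Python version just makes the transformation step concrete.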

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Strong experience in SQL and BigQuery within large-scale data environments.
  • Hands-on experience with GCP services, especially BigQuery, GCS, Dataflow, and Pub/Sub.
  • Proficiency in Python for data manipulation and pipeline development.
  • Experience with Apache Airflow or similar orchestration tools.
  • Solid understanding of data modeling, ETL best practices, and performance tuning.

Good to Have

  • Exposure to Java-based data processing frameworks
  • Experience working in client-facing or consulting environments
  • Knowledge of real-time data processing patterns and event-driven architectures
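As a sketch of the event-driven pattern mentioned above: Pub/Sub push deliveries wrap the publisher's payload in a JSON envelope whose `data` field is base64-encoded, and a handler decodes it and routes the event to a real-time or batch path. The envelope shape follows the documented Pub/Sub push format; the routing rule, attribute names, and resource names are illustrative assumptions.

```python
import base64
import json

def handle_push(envelope: dict) -> dict:
    """Decode a Pub/Sub push envelope and pick a processing path.

    Push deliveries carry the payload base64-encoded in
    envelope["message"]["data"]; string attributes ride alongside it.
    """
    message = envelope["message"]
    payload = json.loads(base64.b64decode(message["data"]).decode("utf-8"))
    attributes = message.get("attributes", {})
    # Illustrative routing rule: an attribute flags latency-sensitive events.
    route = "realtime" if attributes.get("priority") == "high" else "batch"
    return {"route": route, "payload": payload}

# Simulate what a Pub/Sub push subscription would POST to the handler.
event = {"table": "orders", "gcs_uri": "gs://example-bucket/orders/2024-05-06.json"}
envelope = {
    "message": {
        "data": base64.b64encode(json.dumps(event).encode("utf-8")).decode("ascii"),
        "attributes": {"priority": "high"},
        "messageId": "1234567890",
    },
    "subscription": "projects/example-project/subscriptions/example-sub",
}
print(handle_push(envelope))
```

A production handler would acknowledge by returning a success HTTP status and hand the decoded event to a Dataflow job or an Airflow DAG trigger rather than returning it.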


Job ID: 147316645