Job Description
Required Skills:
- Strong hands-on experience with Google BigQuery
- Advanced proficiency in SQL for data analysis and transformation
- Experience working on large-scale data warehousing and analytics solutions
- Solid understanding of the GCP ecosystem, especially BigQuery architecture
- Experience with ETL/ELT pipelines
- Knowledge of data modeling concepts (star schema, snowflake, denormalization)
- Strong analytical and problem-solving skills
- Ability to work in an Agile delivery model
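The modeling concepts named above (star schema, denormalization) can be sketched in a few lines of Python; the tables, columns, and values here are purely illustrative, not from any real dataset:

```python
# Hypothetical star schema: one fact table plus two dimension tables,
# keyed by surrogate IDs (all names and values are made up).
dim_customer = {1: {"name": "Acme", "region": "EMEA"}}
dim_product = {10: {"sku": "W-100", "category": "widgets"}}
fact_sales = [
    {"customer_id": 1, "product_id": 10, "qty": 3, "amount": 29.97},
]

def denormalize(facts, customers, products):
    """Flatten the star schema into one wide, denormalized table."""
    wide = []
    for f in facts:
        row = dict(f)                          # keep the measures
        row.update(customers[f["customer_id"]])  # pull in dimension attrs
        row.update(products[f["product_id"]])
        wide.append(row)
    return wide

wide_table = denormalize(fact_sales, dim_customer, dim_product)
# Each wide row now carries dimension attributes alongside the measures,
# trading storage for fewer joins at query time.
```

In a warehouse like BigQuery the same trade-off appears as joined dimension tables versus a pre-joined (or nested/repeated) wide table.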
Responsibilities:
- Design, develop, and maintain data models and datasets using Google BigQuery
- Build and optimize complex SQL queries, views, stored procedures, and UDFs
- Implement data ingestion and transformation pipelines using GCP services (Dataflow, Dataproc, Cloud Composer, Pub/Sub, etc.)
- Ensure performance optimization, cost optimization, and data quality in BigQuery
- Collaborate with business analysts and stakeholders to understand reporting and analytics requirements
- Implement data governance, security, and access controls using IAM and GCP best practices
- Support production deployments, monitoring, and troubleshooting of data pipelines
- Mentor junior team members and contribute to best practices and reusable frameworks
- Participate in architecture reviews and technical discussions
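The cost-optimization point above often comes down to partition pruning: BigQuery bills by bytes scanned, so filtering on a table's partitioning column lets it skip whole partitions. A minimal stand-alone sketch of that idea (plain Python, not the BigQuery API; the data is hypothetical):

```python
from datetime import date

# Toy stand-in for a date-partitioned table: rows grouped by partition day.
partitions = {
    date(2024, 1, 1): [{"amount": 10.0}, {"amount": 20.0}],
    date(2024, 1, 2): [{"amount": 5.0}],
}

def query_with_pruning(parts, start, end):
    """Sum `amount`, scanning only partitions inside the date filter."""
    rows_scanned = 0
    total = 0.0
    for day, rows in parts.items():
        if start <= day <= end:   # pruning: partitions outside the range are never read
            rows_scanned += len(rows)
            total += sum(r["amount"] for r in rows)
    return total, rows_scanned

total, rows_scanned = query_with_pruning(
    partitions, date(2024, 1, 2), date(2024, 1, 2)
)
# Only the 2024-01-02 partition is read; in BigQuery that means fewer bytes billed.
```

The real-world equivalent is declaring `PARTITION BY` (and often `CLUSTER BY`) on the table and always filtering on the partition column in queries.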