
JOB DESCRIPTION
We are looking for a Data Architect with strong hands-on experience in GCP (BigQuery) to design scalable data architecture for an AI-driven platform. The role involves defining data models, semantic layers for NL→SQL use cases, and real-time CDC pipelines, along with guiding a Redshift-to-BigQuery migration and performance optimization. The candidate should have 10–14 years of overall experience, including 3–5 years on GCP, and must have previously designed modern data platforms with a focus on query performance, data quality, and cost efficiency. Strong collaboration with engineering and AI teams, as well as direct client interaction, will be required.
Data Architect – Tools & Project Exposure
Tools / Technologies (hands-on experience required)
• Google BigQuery – data modeling, optimization, query performance
• GCP Services – GCS, Dataflow, Pub/Sub
• CDC Tools – Debezium / Kafka / native streaming alternatives
• SQL – advanced query design and tuning
• Source System – Amazon Redshift (for the migration use case)
• Experience in cost optimization and performance tuning on cloud platforms
Data Architecture & Design
• Designing scalable, high-performance data architectures
• Building semantic/curated data layers for analytics and AI use cases
• Defining data standards, naming conventions, and governance frameworks
• Handling data quality, validation, and consistency checks
Migration & Integration
• Experience in data migration (e.g., Redshift/Snowflake → BigQuery)
• Converting stored procedures to optimized SQL pipelines
• Integrating multiple data sources (batch + streaming)
AI/Analytics Alignment
• Understanding of data preparation for AI/ML and NL→SQL systems
• Ability to design LLM-friendly schemas and datasets
Soft Skills / Functional Expertise
• Strong problem-solving and systems thinking
• Ability to translate business requirements into technical design
• Effective client communication and stakeholder management
Job ID: 146205031