Data Engineer - GCP / BigQuery (D365 & Salesforce Integrations)
Location: Remote
Role Summary
We are seeking a pragmatic Data Engineer to design, build, and operate ETL pipelines that move and transform data from Microsoft Dynamics 365 and Salesforce into Google BigQuery using Google Dataflow/Dataform. The role focuses on periodic ingestion from multiple sources, ensuring data freshness, accuracy, and correctness for CFO-facing reporting. Final datasets will feed Power BI dashboards and reports used by the finance organization.
Key Responsibilities
- Design and implement scalable ETL/ELT pipelines in Google Cloud (Dataflow, Dataform, Cloud Composer/Cloud Functions as needed) to extract data from D365, Salesforce, and other sources into BigQuery.
- Build incremental extract strategies, change-data-capture patterns, and efficient load processes for periodic data synchronization (an illustrative sketch follows this list).
- Develop robust transformation, enrichment, and data modeling in BigQuery to create finance-ready datasets suitable for Power BI consumption.
- Implement data quality, validation, and reconciliation checks (completeness, referential integrity, business-rule validations) and automated alerting on anomalies.
- Optimize BigQuery schemas, partitions, and query performance; manage cost-efficient storage and compute usage.
- Create and maintain CI/CD pipelines for ETL code, tests, and deployments.
- Document data lineage, mappings, runbooks, and operational procedures; participate in production support and on-call rotations as required.
- Partner with business stakeholders (finance/CFO team), BI developers, and integration teams to define requirements, data contracts, and SLAs around freshness and accuracy.
- Support UAT, troubleshooting, and production support during go-live and hypercare.
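
To illustrate the incremental synchronization responsibility above, here is a minimal sketch of one periodic load step, assuming Salesforce access via the simple_salesforce library and the google-cloud-bigquery client. Project, dataset, table, and field names are hypothetical; the actual objects and watermark handling would be defined with the finance team.

# Minimal sketch of a periodic incremental load: pull Salesforce Opportunity rows
# modified since the last watermark, stage them in BigQuery, then MERGE into the
# target table. All identifiers (project, dataset, tables, fields) are hypothetical.
from google.cloud import bigquery
from simple_salesforce import Salesforce

PROJECT = "finance-dw"                      # hypothetical project
STAGING = f"{PROJECT}.staging.sf_opportunity"
TARGET = f"{PROJECT}.finance.opportunity"

def run_incremental_load(sf: Salesforce, watermark_iso: str) -> None:
    client = bigquery.Client(project=PROJECT)

    # 1) Incremental extract: only rows changed since the last successful run.
    soql = (
        "SELECT Id, Name, Amount, StageName, CloseDate, LastModifiedDate "
        f"FROM Opportunity WHERE LastModifiedDate > {watermark_iso}"
    )
    records = sf.query_all(soql)["records"]
    rows = [{k: v for k, v in r.items() if k != "attributes"} for r in records]
    if not rows:
        return

    # 2) Stage the batch (truncate-and-replace is fine for a staging table).
    job_config = bigquery.LoadJobConfig(
        write_disposition="WRITE_TRUNCATE", autodetect=True
    )
    client.load_table_from_json(rows, STAGING, job_config=job_config).result()

    # 3) Upsert into the target so reruns stay idempotent.
    merge_sql = f"""
    MERGE `{TARGET}` t
    USING `{STAGING}` s
    ON t.Id = s.Id
    WHEN MATCHED THEN UPDATE SET
      Name = s.Name, Amount = s.Amount, StageName = s.StageName,
      CloseDate = s.CloseDate, LastModifiedDate = s.LastModifiedDate
    WHEN NOT MATCHED THEN INSERT ROW
    """
    client.query(merge_sql).result()

In practice a step like this would be scheduled from Cloud Composer or a Dataform pipeline, with the watermark persisted between runs.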
Required Qualifications
- 5+ years of experience in data engineering or a similar role, with hands-on work in GCP and BigQuery.
- Practical experience extracting data from Salesforce and/or Dynamics 365 (APIs, bulk APIs, connectors).
- Hands-on experience with Google Dataflow, Dataform, Cloud Composer (Airflow), or equivalent orchestration/ETL frameworks.
- Strong SQL skills and experience designing data models for analytics in BigQuery (denormalized tables, partitioning, clustering); an illustrative sketch follows this list.
- Experience implementing data quality checks, reconciliation processes, and monitoring.
- Familiarity with Power BI data consumption patterns (dataset design, refresh scheduling) and collaboration with BI teams.
- Familiarity with source control (Git) and CI/CD practices for data pipelines.
- Good communication skills and experience collaborating with business stakeholders, especially finance teams.
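
As a rough illustration of the modeling and data-quality expectations above, the sketch below creates a partitioned, clustered finance-ready table and runs a simple row-count reconciliation, using the google-cloud-bigquery client. Table and column names are hypothetical placeholders, not the actual finance schema.

# Minimal sketch: a partitioned, clustered finance-ready table plus a simple
# row-count reconciliation between staging and target. Identifiers are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="finance-dw")  # hypothetical project

ddl = """
CREATE TABLE IF NOT EXISTS `finance-dw.finance.fct_invoice_lines`
(
  invoice_id STRING,
  invoice_line_id STRING,
  customer_id STRING,
  amount NUMERIC,
  currency STRING,
  invoice_date DATE,
  source_system STRING,          -- e.g. 'D365' or 'Salesforce'
  loaded_at TIMESTAMP
)
PARTITION BY invoice_date        -- prunes scans for month/quarter reporting
CLUSTER BY customer_id, source_system
"""
client.query(ddl).result()

# Reconciliation: staged rows for today's load should all land in the target.
check_sql = """
SELECT
  (SELECT COUNT(*) FROM `finance-dw.staging.d365_invoice_lines`) AS staged,
  (SELECT COUNT(*)
   FROM `finance-dw.finance.fct_invoice_lines`
   WHERE DATE(loaded_at) = CURRENT_DATE()) AS loaded
"""
row = list(client.query(check_sql).result())[0]
if row.staged != row.loaded:
    raise ValueError(f"Reconciliation failed: staged={row.staged} loaded={row.loaded}")

A failed check like this would typically page or alert rather than raise, but the pattern (compare counts or sums across layers, fail loudly) is the same.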
Preferred Skills
- Experience with CDC patterns, Pub/Sub, streaming ingestion, or near-real-time pipelines (an illustrative sketch follows this list).
- Knowledge of cost optimization techniques in BigQuery.
- Experience with other data platforms (Snowflake, Redshift) or middleware (MuleSoft, Boomi).
- Background working on CFO/finance reporting projects or financial data domains.
- GCP certifications (Professional Data Engineer or equivalent) or relevant cloud/data certifications.
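
For the streaming/CDC item above, a minimal sketch of the pattern, assuming change events arrive on a Pub/Sub subscription as JSON messages; the subscription, target table, and payload shape are hypothetical.

# Minimal sketch of near-real-time ingestion: consume change events from a
# Pub/Sub subscription and stream them into a BigQuery staging table.
import json
from google.cloud import bigquery, pubsub_v1

PROJECT = "finance-dw"                                   # hypothetical project
SUBSCRIPTION = f"projects/{PROJECT}/subscriptions/sf-cdc-events"
TABLE = f"{PROJECT}.staging.sf_cdc_events"

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()

def handle(message):
    # Assume each message body is one JSON change event from the source connector.
    event = json.loads(message.data.decode("utf-8"))
    errors = bq.insert_rows_json(TABLE, [event])         # streaming insert
    if errors:
        message.nack()                                   # redeliver on failure
    else:
        message.ack()

future = subscriber.subscribe(SUBSCRIPTION, callback=handle)
future.result()                                          # block and process events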