
Data Architect/Engineer
Full-time
Bengaluru, Karnataka
Role Overview
The Senior Data Solution Lead is the single point of contact for customers across the full data project lifecycle. This role sits at
the intersection of business understanding, data expertise, and delivery discipline, owning discovery, data modeling,
profiling, and issue recommendation, and ensuring the Data Architect receives clean, signed-off inputs before any build
begins.
What Makes This Role Different
SQL-driven data profiling: Uncovers source system data issues independently, before they can be blamed on the migration
Recommendation-driven: Never surfaces a problem without structured options and a preferred path forward
Incremental delivery mindset: Breaks complex scope into sprint-sized, validated data releases
End-to-end ownership: Bridges business and technical so nothing falls through the gap
Key Responsibilities
Discovery & Customer Engagement
• Point of contact — leads discovery sessions, presents findings, obtains signoffs.
• Understands source and target system business & data flows firsthand.
• Trusted advisor — brings problems to the surface early, always with a recommendation.
Data Modeling, Profiling & Quality Advocacy
• Profiles source data independently using SQL — completeness, duplicates, referential integrity, anomalies, field accuracy.
• Produces Source Data Quality Baseline Report before mapping begins.
• For every issue found: documents root cause, business impact, handling options, and recommended resolution.
• Clearly separates source system issues from migration issues — with evidence.
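The profiling checks above — completeness, duplicates, referential integrity — are typically a handful of SQL queries. A minimal sketch using Python's sqlite3 against an in-memory database; the table and column names here are purely illustrative, not from this posting:

```python
# Hypothetical SQL profiling sketch: completeness, duplicates, and
# referential-integrity checks on invented sample tables.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT);
CREATE TABLE orders (id INTEGER, customer_id INTEGER);
INSERT INTO customers VALUES (1, 'a@x.com'), (2, NULL), (3, 'a@x.com');
INSERT INTO orders VALUES (10, 1), (11, 2), (12, 99);  -- 99 has no customer
""")

# Completeness: rows with a missing (NULL) email
missing = cur.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL").fetchone()[0]

# Duplicates: email values appearing more than once
dupes = cur.execute("""
    SELECT email, COUNT(*) AS n FROM customers
    WHERE email IS NOT NULL GROUP BY email HAVING n > 1
""").fetchall()

# Referential integrity: orders pointing at non-existent customers
orphans = cur.execute("""
    SELECT o.id FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()

print(missing, dupes, orphans)
```

Each query result feeds one line of the quality baseline report: counts of nulls, duplicate keys, and orphaned foreign-key references, with the offending rows as evidence.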
Confident Data Conversations
• Leads evidence-based data quality conversations with senior customer stakeholders
• Presents structured recommendations — options, consequences, preferred approach
• Ensures all data handling decisions are formally documented and signed off
• Never lets a post-migration issue surface without prior documentation and an agreed resolution
Documentation & Sign-off Gate
• Drives BA to produce current state workflow, future state workflow, business rules inventory, data dictionary, gap analysis, and quality baseline report
• Validates all documentation for technical and business accuracy
• No build begins without customer sign-off — enforces this as a hard gate
Delivery, Architecture Enablement & Governance
• Feeds Data Architect with structured, signed-off, assumption-free inputs
• Creates and reviews data model and data mapping outputs against profiling findings and business intent — challenges where needed.
• Plans delivery in sprint slices — profiling findings determine sequencing
• Enforces change control, flags risks early, contributes learnings back to the Data Practice.
Skills & Experience Required
Customer, Communication & Delivery
Data Profiling, Modeling, SQL & Technical
• Strong, hands-on SQL — profiling queries, anomaly investigation, reconciliation counts, independent of engineers.
• Experienced in data profiling, quality baseline reporting, gap analysis, and structured issue recommendations.
• Working experience with data modeling — dimensional, relational, or flat — sufficient to create, review, and challenge mapping outputs.
• Experienced in data migration, integration, data mapping, and transformation logic.
• Exposure to common data platforms — cloud warehouses, ETL tools, profiling frameworks such as Talend, Informatica, or similar.
• Source system experience across EMR, ERP, CRM, MWS, legacy databases, Snowflake, Redshift, Azure Data Factory, dbt advantageous.
Job ID: 147495673
Skills:
data engineering , BigQuery, Api Integration, Data Modeling, Pyspark, Kafka, Dataproc, Microservices, Spark, Data Architecture, DataFlow, Airflow, Data Lake ETL ELT, Pub Sub, GCP Google Cloud Platform