Job Title: Data Engineer Lead/Architect (Pre-Sales/Delivery) - Snowflake & Databricks
Job Summary: We are seeking a highly experienced Snowflake and Databricks Lead/Architect who can operate in a dual capacity: driving pre-sales solutioning and client engagement while also leading architecture, design, and delivery execution. This role requires strong technical depth in modern data platforms along with excellent communication and stakeholder management skills.
The ideal candidate will bridge business requirements with scalable cloud data solutions, support sales pursuits with compelling technical narratives, and lead implementation teams to deliver high-quality data engineering and analytics solutions.
Key Responsibilities
1. Pre-Sales & Solutioning
- Partner with sales teams to understand client requirements and craft winning technical solutions.
- Design end-to-end data architectures using Snowflake and Databricks on AWS/Azure/GCP.
- Lead technical discovery workshops and customer presentations.
- Develop solution proposals, effort estimations, architecture diagrams, and RFP responses.
- Conduct POCs, demos, and performance benchmarking.
- Provide thought leadership on modern data platforms, lakehouse architecture, and AI/ML enablement.
2. Architecture & Technical Leadership
- Define enterprise data architecture standards and best practices.
- Design scalable, secure, and cost-optimized solutions using:
  - Snowflake (Data Warehousing, Data Sharing, Snowpark, Performance Optimization)
  - Databricks (Delta Lake, Spark, Unity Catalog, MLflow)
- Lead the development of data ingestion, transformation, and orchestration frameworks.
- Implement CI/CD, DevOps, and data governance best practices.
- Ensure performance tuning, cost optimization, and security compliance.
3. Delivery & Team Leadership
- Lead cross-functional data engineering teams.
- Provide technical mentoring and code reviews.
- Ensure high-quality and timely project delivery.
- Act as technical escalation point for complex issues.
- Collaborate with product owners, business stakeholders, and cloud teams.
Required Skills & Experience
Technical Skills
- 8+ years in Data Engineering / Data Architecture.
- Strong expertise in:
  - Snowflake (Advanced SQL, Warehouses, Cloning, Data Sharing, RBAC)
  - Databricks (PySpark, Delta Lake, Workflows, Notebooks, Unity Catalog)
- Deep understanding of Lakehouse and Data Warehouse architecture.
- Experience with AWS / Azure / GCP cloud ecosystems.
- Proficiency in Python and SQL.
- Experience with ETL/ELT tools and orchestration (Airflow, ADF, dbt, etc.).
- Data modeling (star/snowflake schemas, dimensional modeling).
- Experience in real-time and batch data processing.
Pre-Sales Skills
- Experience supporting RFP/RFI responses.
- Strong client-facing presentation and demo skills.
- Ability to translate business problems into technical architecture.
- Experience building solution accelerators and reusable frameworks.
Leadership & Soft Skills
- Excellent communication and stakeholder management skills.
- Strong documentation and architecture diagramming ability.
- Experience leading distributed teams.
- Ability to work in fast-paced, client-driven environments.
Preferred Qualifications
- Snowflake and/or Databricks certifications.
- Experience with Data Governance & MDM tools.
- Knowledge of AI/ML workloads on Databricks.
- Experience in industry domains such as BFSI, Retail, Healthcare, or Manufacturing.
Education
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.