Senior Data Engineer (Professional Services)
Location: India (100% Remote) | Experience: 4–6 Years | Work Dynamics: High-Growth Startup | Global Teams | Flexible Hours
1. Role Summary
Technical lead for global enterprise data projects in a high-velocity startup environment. Requires a hands-on builder who automates aggressively with AI, engages directly with clients, and translates business problems into robust technical designs.
2. Tech Stack
- Orchestration: Apache Airflow (Expert: multi-tenant DAGs, SLAs, monitoring).
- Data Platforms: Snowflake (Ingestion, performance tuning) & DuckDB (Local analytics/prototyping).
- Coding: Advanced Python (ETL frameworks) & Expert SQL (Data modeling fundamentals).
- AI Productivity: Proficiency with LLMs/Copilots for code, docs, and self-healing pipelines.
3. Key Responsibilities
- Design and ship end-to-end ELT/ETL pipelines for enterprise customers.
- Automate SQL generation and data quality tasks using AI.
- Lead technical workshops and present solution designs to US/EMEA stakeholders.
- Develop reusable Python frameworks to adapt to varied customer landscapes.
- Manage Agile delivery, sprint planning, and production Hypercare phases.
- Create consulting assets like accelerators and best practice templates.
4. Required Skills
- 5–7 years in Data Engineering or Technical Consulting.
- Deep knowledge of Snowflake, DuckDB, and Airflow orchestration.
- Expertise in modern data modeling (Star Schema, Kimball).
- Fluent English for presenting trade-offs to international clients.
- Ability to troubleshoot and deliver results with ambiguous requirements.
5. Ideal Mindset & Personality
- AI-First: Eager to automate manual tasks immediately.
- Global Flexibility: Willing to shift hours for US/EMEA team syncs.
- Creative Laziness: Hates repetitive work; builds tools to avoid it.
- Startup DNA: High ownership and ability to operate with partial information.
- Execution Focused: Balances speed, quality, and client expectations seamlessly.