Role: Data Engineering Solution Architect
- Core Mission: Designing the Future of Fact-Based Intelligence
- Investment: 40L–48L Per Annum
- Location: Hyderabad (Strategic Hub, 5 Days On-site)
- Notice Period: Immediate to 30 Days (Non-negotiable)
- Key Tech Pillars: Snowflake, Databricks, dbt, Python, Airflow, & Multi-Cloud (AWS/Azure/GCP)
I. The Mandate: Architecting at Scale
As a Solution Architect, you are the primary technical visionary for our Data Practice. You will not just oversee pipelines; you will engineer the Digital Sovereignty of our global clients. You will bridge the gap between complex business problems and elegant, scalable technical solutions, designing architectures that support millions of transactions and power real-time AI/ML insights. This is a role for a Leader-Architect who thrives in the high-pressure environment of strategic consulting.
II. Strategic Pillars of Responsibility
1. End-to-End Architectural Governance
- Modern Data Stack Design: Lead the blueprinting of high-performance Snowflake and Databricks ecosystems. You will own the transition from traditional ETL to high-velocity ELT frameworks using dbt and Matillion.
- Medallion & Layered Frameworks: Architect multi-layered data lakes (Bronze, Silver, Gold) following the Medallion Architecture, ensuring data quality, lineage, and governance are embedded at every stage (see the sketch after this list).
- Hybrid & Multi-Cloud Orchestration: Design resilient data movement strategies across AWS, Azure, and GCP, leveraging cloud-native services like BigQuery, Redshift, and Athena.
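To make the layering concrete, here is a minimal PySpark sketch of a Bronze-to-Silver promotion. The paths, table, column names, and quality rules are illustrative assumptions rather than a prescribed client design, and it presumes Delta Lake is available on the cluster.

```python
# Hypothetical Bronze -> Silver promotion; paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw, append-only ingests landed as-is.
bronze = spark.read.format("delta").load("/lake/bronze/orders")

# Silver: cleansed, deduplicated, conformed records with audit metadata.
silver = (
    bronze
    .dropDuplicates(["order_id"])                         # enforce entity-level uniqueness
    .filter(F.col("order_ts").isNotNull())                # reject rows failing a basic quality check
    .withColumn("ingest_date", F.to_date("order_ts"))     # conform types for downstream joins
    .withColumn("_processed_at", F.current_timestamp())   # lineage/audit column
)

silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")
```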
2. Streaming, APIs, and Real-Time Integration
- Event-Driven Ingestion: Architect solutions for real-time and micro-batch data streaming using Kafka, Kinesis, or Pub/Sub.
- API Sovereignty: Lead the design for API-based data integration, ensuring sub-second latency for critical reporting and AI-driven applications.
- Orchestration Mastery: Define global standards for workflow automation using Airflow, Control-M, or equivalent enterprise schedulers (see the sketch below).
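As an orchestration illustration, here is a minimal Airflow DAG sketch. The DAG id, schedule, and task callables are hypothetical placeholders, not a mandated standard.

```python
# Hypothetical daily ELT DAG; ids, schedule, and callables are assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_events(**context):
    """Pull the latest micro-batch from the stream into the Bronze layer."""
    ...

def run_dbt_models(**context):
    """Trigger the dbt ELT run that builds Silver and Gold models."""
    ...

default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    ingest = PythonOperator(task_id="ingest_events", python_callable=ingest_events)
    transform = PythonOperator(task_id="run_dbt_models", python_callable=run_dbt_models)

    ingest >> transform  # transformation runs only after ingestion succeeds
```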
3. Thought Leadership & Team Multiplication
- Capability Building: Act as the Engineering Standard-Bearer, defining reusable assets and frameworks to improve delivery velocity across the organization.
- Mentorship: Directly mentor senior data engineers, fostering a culture of technical excellence and continuous learning.
- Visualization Strategy: Oversee the delivery of sophisticated reporting layers using Streamlit, Power BI, or Tableau, turning cold data into hot insights (a minimal sketch follows).
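For the reporting layer, a thin Streamlit sketch is shown below; the metric names and the static sample frame are illustrative assumptions standing in for a live Snowflake/Databricks query.

```python
# Hypothetical Streamlit reporting layer over a Gold table; data is a stand-in.
import pandas as pd
import streamlit as st

@st.cache_data(ttl=600)  # cache the query result for 10 minutes
def load_gold_metrics() -> pd.DataFrame:
    # In practice this would query Snowflake/Databricks; a static frame keeps the sketch self-contained.
    return pd.DataFrame(
        {"region": ["NA", "EMEA", "APAC"], "revenue": [1.2, 0.9, 1.4]}
    )

st.title("Daily Revenue by Region")
df = load_gold_metrics()
st.bar_chart(df.set_index("region")["revenue"])
```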
III. The Pedigree (Candidate Blueprint)
- Experience: 8–12 years in core Data Engineering and Architecture. You must have led at least 2–3 large-scale migrations from legacy to modern cloud platforms.
- Technical Depth:
  - Expert-level Python/PySpark and SQL.
  - Deep expertise in Snowflake or Databricks (certifications like SnowPro or Databricks Architect are highly preferred).
  - Familiarity with traditional ETL (Informatica/SSIS) to manage legacy-to-cloud transitions.
- AI/ML Integration: Exposure to integrating AI/ML modules within data pipelines is a mandatory requirement as the role evolves (see the sketch below).
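For a sense of what such integration can look like, here is a minimal, self-contained sketch of scoring a micro-batch inside a pipeline step; the toy model, feature names, and risk threshold are illustrative assumptions.

```python
# Hypothetical in-pipeline scoring step; features and threshold are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# A toy model stands in for the pre-trained artifact a real pipeline would load.
train_X = pd.DataFrame({"tenure_months": [2, 30, 5, 48], "monthly_spend": [80, 20, 95, 15]})
train_y = [1, 0, 1, 0]  # 1 = churned
model = LogisticRegression().fit(train_X, train_y)

def score_batch(df: pd.DataFrame) -> pd.DataFrame:
    """Append churn scores so downstream Gold tables can filter on risk."""
    out = df.copy()
    out["churn_score"] = model.predict_proba(df[["tenure_months", "monthly_spend"]])[:, 1]
    out["at_risk"] = out["churn_score"] > 0.8  # threshold is an assumed business rule
    return out

batch = pd.DataFrame({"tenure_months": [3, 40], "monthly_spend": [90, 18]})
print(score_batch(batch))
```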
IV. Performance Benchmarks
- Architectural ROI: Reducing client data latency and cloud compute costs through efficient performance tuning.
- Delivery Excellence: Scaling the internal knowledge base of dbt macros, RBAC hierarchies, and CI/CD templates.
- Stakeholder Confidence: Leading successful client demos and technical defense of proposed blueprints.
V. Why Join the Practice
- C-Suite Influence: You will report directly into leadership and have a direct say in the practice's technology roadmap.
- Complexity: Solve Big Four-level problems with the agility of a specialized tech firm.
- Wealth & Growth: A premium compensation package of up to 48 LPA with aggressive career growth into Practice Leadership.
Skills: databricks, airflow, snowflake, python, multi-cloud (aws/azure/gcp), data