About the Role
Sonata is seeking a highly skilled Solution Archive Architect with strong data engineering and cloud engineering capabilities. In this role, you will be responsible for architecting large-scale data platforms, leading archival implementation initiatives, and enabling clients to modernize and transform their data ecosystems. You will work closely with business and technology stakeholders to design future-ready, compliant, and cost-efficient data archiving and retention solutions.
Key Responsibilities
Solution Architecture & Design
- Architect modern data archiving and retention solutions aligned with business, regulatory, and compliance requirements.
- Design scalable data engineering workflows, archival pipelines, and storage frameworks across cloud and on-premises platforms.
- Create high-level and low-level architecture documents, solution blueprints, and integration patterns.
- Lead the modernization of legacy archival systems toward cloud-native and hybrid models.
Data Engineering & Cloud Engineering
- Build and optimize large-scale data platforms using modern cloud technologies (Azure, AWS, GCP).
- Design ETL/ELT frameworks for long-term data retention, tiered storage, and lifecycle management.
- Ensure solutions follow best-in-class performance, security, governance, and cost optimization principles.
Technical Problem-Solving
- Troubleshoot complex data, storage, and integration challenges across distributed environments.
- Provide SME guidance on archival patterns, data migration, data lineage, and lifecycle management.
- Evaluate existing systems and recommend modernization or transformation strategies.
Cross-Functional Collaboration
- Collaborate with business leaders, compliance teams, and technical stakeholders to align architecture with business and regulatory requirements.
- Support client workshops, requirement gathering, and technical advisory engagements.
- Communicate architectural decisions clearly to both technical and non-technical audiences.
Skills & Experience Required
Core Technical Skills
- Strong hands-on expertise in data engineering, data architecture, and cloud platforms (Azure preferred; AWS/GCP a plus).
- Deep understanding of large-scale data platforms such as Hadoop, Spark, Databricks, Snowflake, Synapse, BigQuery, Redshift, etc.
- Experience implementing enterprise archiving solutions, including cold storage, nearline storage, and compliance-driven retention.
- Proficiency in programming/scripting: Python, SQL, PySpark, or equivalent.
- Knowledge of data governance, metadata management, and information lifecycle management.
Legacy & Traditional Platforms
- Breadth of experience across older data platforms (e.g., Teradata, Oracle Exadata, SAP BW, Mainframe data systems).
- Experience migrating or archiving data from legacy environments into cloud or hybrid storage solutions.
Soft Skills
- Strong communication and stakeholder management skills.
- Ability to translate business needs into technical requirements.
- Strong analytical and problem-solving capabilities.
- Ability to lead teams and work in a collaborative, fast-paced consulting environment.
Preferred Qualifications
- 10+ years in data engineering, data architecture, or enterprise architecture roles.
- Experience in large-scale digital transformation programs.
- Cloud certifications (Azure Solutions Architect, AWS Solutions Architect, GCP Architect).
- Experience working in consulting or client-facing environments.