We are seeking an experienced Cloud Data Architect to design, implement, and evangelize scalable, secure data architectures in a cloud environment. The ideal candidate is a technical expert who drives delivery excellence and also partners with our sales and pre-sales teams to create reusable assets that support business development and client acquisition. This role is ideal for a professional who thrives at the intersection of technical architecture and strategic business growth.
Main Responsibilities
Technical Architecture & Delivery
- Design and build robust cloud-based data platforms, including data lakes, data warehouses, and real-time data pipelines.
- Ensure data quality, consistency, and security across all systems and projects.
- Work closely with cross-functional teams to integrate diverse data sources into a cohesive and performant architecture.
Sales & Pre-Sales Support
- Serve as a technical advisor in sales engagements to develop and articulate compelling cloud data solutions.
- Create reusable assets, solution blueprints, and reference architectures to support RFP responses and sales proposals.
- Present technical roadmaps and participate in client meetings to demonstrate the value of our solutions to prospective clients.
Cross-Functional Collaboration
- Act as a liaison between the technical delivery and sales teams to keep delivery and go-to-market strategies aligned.
- Mentor team members and lead technical discussions on architecture best practices.
Required Qualifications
- A Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Extensive experience in data architecture and design within cloud environments.
- A proven track record of supporting pre-sales initiatives and business development.
- Demonstrated experience with cloud data platforms and big data technologies.
- Strong analytical, problem-solving, and communication skills.
Key Skills & Technologies
Must-Have
- Cloud Platforms: Proficiency in AWS, Azure, or Google Cloud.
- Data Warehousing & Data Lakes: Hands-on experience with technologies like Redshift, BigQuery, or Snowflake.
- Big Data Technologies: Experience with Hadoop, Spark, and Kafka.
- Databases: Strong skills in both SQL & NoSQL databases.
- ETL/ELT: Expertise in designing, building, and maintaining ETL/ELT pipelines with modern tooling.
- Data Modeling: Advanced knowledge of data modeling and architecture design.
Good-to-Have
- Infrastructure-as-Code (IaC): Familiarity with Terraform or CloudFormation.
- Containerization: Experience with Docker and Kubernetes.
- Programming: Proficiency in Python, Java, or Scala.
- Data Governance: Knowledge of data governance and security best practices.