Cradlepoint is seeking a highly experienced Data Architect to join our Data and Analytics function. This role is crucial in shaping our Analytics and AI strategy, driving multi-functional execution, and transforming Cradlepoint into an Intelligent Enterprise. You will be instrumental in defining and designing robust data solutions, from collection and analytics to machine learning, ensuring scalability, redundancy, and high performance. You will collaborate closely with stakeholders, product owners, developers, and other architects to provide actionable insights and build a secure, scalable data infrastructure.
What You Will Do: Key Responsibilities
- Facilitate requirements analysis with stakeholders for data collection, analytics, and machine learning needs.
- Work with the IT Digital Product Owner to ensure a thorough understanding of business requirements in Data Management for the respective functional area.
- Identify and propose new opportunities for data services, driving innovation.
- Define characteristics for desired scale, redundancy, distribution, protection, and other critical criteria for different types of data services.
- Collaborate with Developers and other Functional Area Architects for cross-functional data requirements.
- Provide technical leadership to Scrum teams during development, ensuring architectural integrity.
- Define and design appropriate interfaces for data update, retrieval queries, and recommended workflows based on specific data use cases.
The Skills You Bring: Required Qualifications
- Education: Graduate degree (B.E./B.Tech/M.Tech) in Computer Science, Information Systems, or another quantitative field.
- Experience: Strong prior experience in Data Warehousing and BI, irrespective of technology. Prior experience as an Architect leading a delivery team. Strong hands-on experience as a BI/ETL developer.
- 10+ years of experience as a Data Architect, Solution Architect, or Data Engineer.
Software/Tools: Required Expertise
- Strong understanding of and hands-on experience with Snowflake architecture.
- Strong data modelling experience.
- Experienced in batch and streaming processing (for instance, Apache Spark or Apache Storm) and relevant back-end languages such as Python, PySpark, and SQL.
- Proficient in software development practices, including Agile/Scrum, CI/CD, testing and configuration management, and Git/Gerrit.
- Expertise with BI tools such as Tableau and Power BI, as well as web-based dashboards and solutions.
Preferred Experience and Skills (Merits)
The more of the following experiences and skills you possess, the better:
- Cloud Agnostic Data Platforms such as Snowflake, Databricks, and SAP Data Warehouse Cloud.
- Ability to independently create designs and data models using Data Vault 2.0.
- Big data technologies, such as Hadoop, Hive, Pig, or MapR, or enterprise data warehousing initiatives.
- Cloud Native Data Platforms (Azure Synapse Analytics, Amazon Redshift).
- Azure services - Azure Data Factory (ADF), Azure Databricks (ADB).
- Designing data products and publishing data using a data marketplace platform.
- Authorization methods including Role-Based Access Control and Policy-Based Access Control.
- Data mesh implementation and federated model for data product development.
- Relational Data Modeling with 3NF.
- Modern Data Architectures including cloud native, microservices architecture, virtualization, Kubernetes, and containerization.
- Messaging (for instance Apache Kafka).
- NoSQL databases, such as Cassandra and MongoDB.
- RDBMS such as Oracle, MS SQL, MariaDB, or PostgreSQL, both row-based and columnar-based.
- Experience with ETL tools (Talend, Informatica, SAP BODS, etc.).