Minimum qualifications:
- Bachelor's degree in Computer Science, Engineering, Mathematics, a related field, or equivalent practical experience.
- 10 years of experience in Big Data, Data Warehousing, Data Modeling, Data Mining, and Hadoop.
- Experience building multi-tier, high-availability applications with modern technologies such as NoSQL databases (e.g., MongoDB), Spark MLlib, and TensorFlow.
- Experience with Google Cloud Platform (GCP).
Preferred qualifications:
- Experience in Big Data, information retrieval, data mining, or machine learning.
- Experience with Infrastructure as Code (IaC) and CI/CD tools such as Terraform, Ansible, and Jenkins.
- Experience architecting, developing software, or Big Data solutions in virtualized environments.
- Experience with encryption techniques (e.g., symmetric, asymmetric, and envelope encryption) and hardware security modules (HSMs).
- Ability to implement secure key storage using a Key Management Service (KMS).
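To illustrate the envelope-encryption pattern referenced above: a minimal sketch, assuming the third-party `cryptography` package, in which a per-payload data-encryption key (DEK) encrypts the data and a key-encryption key (KEK) wraps the DEK. Here a local Fernet key stands in for the KMS-held KEK; in production the wrap/unwrap steps would be KMS API calls, and all names are illustrative.

```python
# Envelope encryption sketch: the DEK encrypts the payload; the KEK (held by
# a KMS in production) encrypts only the DEK. The wrapped DEK is stored
# alongside the ciphertext, so the plaintext DEK never needs to be persisted.
from cryptography.fernet import Fernet

def envelope_encrypt(plaintext: bytes, kek: Fernet) -> tuple[bytes, bytes]:
    """Return (wrapped_dek, ciphertext)."""
    dek_bytes = Fernet.generate_key()           # fresh DEK per payload
    ciphertext = Fernet(dek_bytes).encrypt(plaintext)
    wrapped_dek = kek.encrypt(dek_bytes)        # a KMS would perform this wrap
    return wrapped_dek, ciphertext

def envelope_decrypt(wrapped_dek: bytes, ciphertext: bytes, kek: Fernet) -> bytes:
    dek_bytes = kek.decrypt(wrapped_dek)        # a KMS would unwrap the DEK
    return Fernet(dek_bytes).decrypt(ciphertext)

kek = Fernet(Fernet.generate_key())             # stand-in for a KMS-held KEK
wrapped, ct = envelope_encrypt(b"customer record", kek)
assert envelope_decrypt(wrapped, ct, kek) == b"customer record"
```

The point of the design is that rotating or revoking the KEK in the KMS controls access to every DEK, without re-encrypting the bulk data itself.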
Responsibilities:
- Interact with stakeholders to translate customer requirements into recommendations for appropriate solution architectures and advisory services.
- Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP).
- Help Google Cloud customers assess their current infrastructure, design and architect the target infrastructure, develop a migration plan, and deliver technical workshops to educate them on GCP.
- Participate in technical and design discussions with technical teams to speed up the adoption process and ensure best practices during implementation.
- Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
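As a sketch of the kind of data-quality procedure the last responsibility describes, here is a minimal rule-based row validator; the column names, predicates, and thresholds are hypothetical, not taken from any particular pipeline.

```python
# Minimal rule-based data-quality check: each rule maps a column name to a
# predicate, and rows failing any rule are reported so they can be quarantined
# rather than loaded. Column names and rules below are illustrative only.
from typing import Any, Callable

Rule = Callable[[Any], bool]

def validate_rows(rows: list[dict], rules: dict[str, Rule]) -> list[tuple[int, str]]:
    """Return (row_index, column) pairs for every failed check."""
    failures = []
    for i, row in enumerate(rows):
        for column, predicate in rules.items():
            if column not in row or not predicate(row[column]):
                failures.append((i, column))
    return failures

rules = {
    "user_id": lambda v: isinstance(v, int) and v > 0,
    "country": lambda v: isinstance(v, str) and len(v) == 2,  # ISO 3166-1 alpha-2
}
rows = [
    {"user_id": 42, "country": "US"},
    {"user_id": -1, "country": "USA"},   # fails both rules
]
# validate_rows(rows, rules) → [(1, "user_id"), (1, "country")]
```

In practice such checks run as a gate in the ingestion pipeline, with the failure report feeding governance dashboards and alerting.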