We are seeking a part-time Cloud Engineer to design, implement, and maintain scalable cloud-based data pipelines and integration solutions across Microsoft Azure and Google Cloud Platform (GCP).
Key Responsibilities
- Execute end-to-end data migration and replication from Azure (Cosmos DB, Blob Storage) to Google Cloud Storage and other GCP services, ensuring secure, performant transfers across cloud boundaries.
- Design, develop, and manage data ingestion and replication pipelines using the Azure Cosmos DB Change Feed and Azure Functions.
- Architect and implement secure, high-throughput integrations from Azure to Google Cloud services (e.g., Firestore, BigQuery).
- Build and maintain infrastructure as code (IaC) for automated deployment and configuration management in Azure and GCP environments.
- Implement robust error handling, retry logic, and monitoring to ensure data consistency and system reliability.
- Collaborate with cross-functional teams to define data requirements and deliver integration solutions that meet business objectives.
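To make the replication responsibility above concrete, here is a minimal, hedged sketch in Python of one step a candidate would own: batching Cosmos DB change-feed documents into newline-delimited JSON (the format GCS load jobs and BigQuery commonly ingest). The function names, batch size, and in-comment SDK calls are illustrative assumptions, not a prescribed implementation:

```python
import json
from typing import Iterable, Iterator, List


def docs_to_ndjson(docs: Iterable[dict]) -> str:
    """Serialize change-feed documents as newline-delimited JSON,
    a format BigQuery and GCS load jobs commonly accept."""
    return "\n".join(json.dumps(doc, sort_keys=True) for doc in docs)


def batch(docs: Iterable[dict], size: int = 500) -> Iterator[List[dict]]:
    """Group documents into fixed-size batches so each upload stays
    well under object-size and memory limits."""
    buf: List[dict] = []
    for doc in docs:
        buf.append(doc)
        if len(buf) >= size:
            yield buf
            buf = []
    if buf:
        yield buf


# Illustrative wiring (not runnable here): an Azure Function triggered by
# the Cosmos DB Change Feed would receive the changed documents, and each
# NDJSON payload would then be written to Google Cloud Storage, e.g. via
# bucket.blob(object_name).upload_from_string(payload).
```

In practice the batching keeps individual cross-cloud transfers small and restartable, which is what makes the pipeline resilient to transient failures.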
Required Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in cloud data engineering, with a focus on Azure and GCP.
- Proven expertise with Azure Cosmos DB (NoSQL) and the Change Feed mechanism.
- Strong proficiency in developing serverless functions in C#/.NET and/or Python.
- Experience with infrastructure as code tools such as Terraform, ARM templates, or Google Cloud Deployment Manager.
- Solid understanding of cloud networking, security best practices, and IAM configurations.
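The retry logic and error handling called out under Key Responsibilities can be sketched as a small Python decorator. This is an assumed, illustrative pattern (exponential backoff with jitter); the name `with_retries` and its parameters are hypothetical, not part of any SDK:

```python
import random
import time
from functools import wraps


def with_retries(attempts: int = 5, base_delay: float = 0.5):
    """Retry transient failures with exponential backoff plus jitter,
    a common reliability pattern for cross-cloud transfers."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == attempts - 1:
                        raise  # out of retries: surface the error
                    # Sleep base, 2x base, 4x base, ... plus jitter so
                    # parallel workers do not retry in lockstep.
                    time.sleep(base_delay * 2 ** attempt + random.random() * 0.1)
        return wrapper
    return decorator
```

A candidate would be expected to apply this kind of wrapper to upload and replication calls, and to pair it with idempotent writes so retried batches do not duplicate data.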
Preferred Skills
- Hands-on experience with Google Cloud Firestore, BigQuery, and Dataflow.
- Familiarity with CI/CD pipelines using Azure DevOps, GitHub Actions, or equivalent tooling.
- Excellent communication skills and a proven track record of collaborating with distributed teams.
- Certifications such as Microsoft Certified: Azure Data Engineer Associate or Google Professional Data Engineer.