Job Overview
We are seeking a skilled and motivated Snowflake Developer to join our team at Capgemini in Bengaluru. In this role, you will build, optimize, and maintain our cloud-based data warehouse solutions on Snowflake. The ideal candidate has a strong background in data architecture, ETL processes, and SQL development, with a focus on delivering scalable, efficient data solutions.
Key Deliverables
- Design, develop, and maintain data pipelines and ETL processes to load and transform data into Snowflake.
- Implement and optimize SQL queries, stored procedures, and user-defined functions within the Snowflake environment.
- Collaborate with data architects and engineers to design and implement data models and database schemas.
- Monitor and troubleshoot data pipeline performance, ensuring data quality and reliability.
- Develop and maintain comprehensive documentation for data models, ETL processes, and database configurations.
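Much of the day-to-day work behind these deliverables involves writing and tuning Snowflake SQL, such as MERGE-based upserts for incremental loads. As an illustrative sketch only (the table and column names are hypothetical, not taken from this posting), the following Python helper generates a Snowflake MERGE statement for a staging-to-target upsert:

```python
def build_merge_sql(target: str, staging: str,
                    key_cols: list[str], update_cols: list[str]) -> str:
    """Build a Snowflake MERGE statement that upserts rows from a staging
    table into a target table. Identifiers are caller-supplied and assumed
    pre-validated (no quoting or escaping is performed here)."""
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    insert_cols = ", ".join(key_cols + update_cols)
    insert_vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s\n"
        f"ON {on_clause}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Hypothetical usage: upsert customer records from a staged delta load.
sql = build_merge_sql("analytics.customers", "staging.customers_delta",
                      key_cols=["customer_id"],
                      update_cols=["email", "updated_at"])
```

In practice a statement like this would be executed via the Snowflake Python connector or an orchestration tool; generating it from column metadata keeps incremental-load logic consistent across pipelines.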
Essential Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 2 years' hands-on experience implementing Snowflake solutions.
- Solid understanding of data warehousing concepts, principles, and best practices.
- Experience with data governance and data quality frameworks within Snowflake.
Preferred Qualifications
- Experience with cloud platforms such as AWS or Azure.
- Familiarity with Snowflake's integration with AWS and Azure environments.
- Experience in leading teams and managing complex data migration projects.
Skills
Must-Have Skills
- Technical: Proficient in Python for data processing and automation; Strong SQL skills for data manipulation and querying within Snowflake; Experience with Git for version control.
- Domain Knowledge: Solid understanding of data warehousing principles, ETL processes, and data modeling techniques.
- Behavioral & Interpersonal: Strong communication skills to collaborate with cross-functional teams and stakeholders; Ability to explain technical concepts to non-technical audiences.
- Process & SOP: Experience in developing and maintaining documentation for data models, ETL processes, and database configurations.
- Analytical & Problem-Solving: Ability to optimize SQL queries and troubleshoot data pipeline performance issues.
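The troubleshooting skills listed above often reduce to simple reconciliation checks run after each load. A minimal sketch (the function name and tolerance are illustrative, not part of this posting) of a row-count validation step a pipeline might run to catch data-quality drift:

```python
def validate_load(source_count: int, target_count: int,
                  tolerance: float = 0.0) -> tuple[bool, str]:
    """Compare source vs. loaded row counts and flag mismatches beyond a
    fractional tolerance (0.0 means an exact match is required)."""
    if source_count == 0:
        # An empty source is only valid if the target is also empty.
        return (target_count == 0, "empty source")
    drift = abs(source_count - target_count) / source_count
    ok = drift <= tolerance
    return ok, f"source={source_count} target={target_count} drift={drift:.2%}"

# Hypothetical usage: allow up to 0.5% drift between staging and target.
ok, msg = validate_load(1000, 998, tolerance=0.005)
```

A check like this would typically feed an alerting or retry step in the orchestrator rather than failing silently.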
Good-to-Have Skills
- Advanced Technical: Experience with advanced Snowflake features (e.g., Snowpark, Streams, Tasks) and its AI capabilities; Knowledge of automation tools for data pipeline orchestration.
- Additional Certifications: Snowflake certifications (e.g., SnowPro Core); AWS or Azure certifications.
- Cross-Functional Exposure: Experience working with data scientists, business analysts, and other stakeholders to understand data requirements.
- Leadership Traits: Ability to mentor junior developers and provide technical guidance.
- Continuous Improvement: Familiarity with Agile methodologies and continuous integration/continuous deployment (CI/CD) practices.