
Genpact

Principal Consultant - Sr. Snowflake Data Engineer (Python + Cloud)

  • Posted 11 days ago

Job Description

Inviting applications for the role of Principal Consultant- Sr. Snowflake Data Engineer (Python+Cloud)!

In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a shared goal.


  • Design, develop, and optimize scalable data pipelines on Snowflake using Python and cloud-native tools.

  • Build interactive, user-friendly data applications and dashboards using Streamlit to visualize AI-driven insights.

  • Develop Python-based user-defined functions (UDFs) and stored procedures to enhance Snowflake processing.

  • Integrate generative AI models and Cortex AI capabilities into the Snowflake ecosystem to deliver intelligent, automated data products.

  • Implement real-time data streaming and ingestion pipelines to support AI workloads and analytics.

  • Leverage cloud platforms (AWS, Azure, GCP) for scalable data processing and seamless Snowflake integration.

  • Optimize Snowflake data models and query performance to support complex AI inference workloads.

  • Lead efforts in automating data workflows and model deployment pipelines with Python scripting and orchestration tools.

  • Ensure data governance, security, and role-based access control (RBAC) compliance across Snowflake deployments.

  • Develop prototypes and proof-of-concept applications leveraging Gen AI to demonstrate business value.

  • Collaborate with stakeholders to translate AI use cases into scalable data engineering solutions.

  • Mentor and lead engineering teams on best practices around Python, Snowflake, Gen AI, and Streamlit development.

  • Stay current with evolving AI technologies, Snowflake enhancements, and cloud data engineering trends to drive innovation.

  • Snowflake SnowPro Core Certification is a must.
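To illustrate the Python UDF bullet above, here is a minimal sketch of the kind of Python logic that can be registered as a Snowflake UDF. The function `normalize_revenue`, its column names, and the commented-out registration call are illustrative assumptions, not part of the job description; actually registering the UDF via Snowpark requires a live Snowflake session.

```python
def normalize_revenue(amount: float, fx_rate: float) -> float:
    """Convert a local-currency amount to USD, rounded to 2 decimals.

    Hypothetical example function -- not taken from the job description.
    """
    if fx_rate <= 0:
        raise ValueError("fx_rate must be positive")
    return round(amount * fx_rate, 2)

# With an active Snowpark session, a function like this could be registered
# as a Snowflake UDF (sketch only; needs snowflake-snowpark-python and a
# connection):
#
#   from snowflake.snowpark.types import FloatType
#   session.udf.register(
#       func=normalize_revenue,
#       name="normalize_revenue",
#       input_types=[FloatType(), FloatType()],
#       return_type=FloatType(),
#   )
#
# after which it is callable from SQL:
#   SELECT normalize_revenue(amount, fx_rate) FROM sales;
```

Pushing such logic into a UDF lets the transformation run inside Snowflake's compute, next to the data, instead of in an external Python process.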

Roles and Responsibilities:

  • Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.

  • Writing SQL queries against Snowflake and developing scripts to Extract, Load, and Transform data.

  • Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, the Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.

  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.

  • Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).

  • Should have good experience in Python/PySpark integration with Snowflake and cloud platforms (AWS/Azure), with the ability to leverage cloud services for data processing and storage.

  • Proficiency in Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.

  • Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.

  • Should have some experience with Snowflake RBAC and data security.

  • Should have good experience implementing CDC or SCD Type 2.

  • Should have good experience implementing Snowflake best practices.

  • In-depth understanding of Data Warehouse and ETL concepts and Data Modeling.

  • Experience in requirement gathering, analysis, designing, development, and deployment.

  • Should have experience building data ingestion pipelines.

  • Optimize and tune data pipelines for performance and scalability.

  • Able to communicate with clients and lead a team.

  • Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.

  • Good to have: experience in deployment using CI/CD tools and with repositories such as Azure Repos, GitHub, etc.
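The SCD Type 2 requirement above can be sketched with a minimal in-memory example. In Snowflake this is typically a single MERGE statement over a dimension table; the plain-Python version below only illustrates the versioning rules, and the table layout (`customer_id`, `city`, effective-date columns) is a hypothetical example, not from the job description.

```python
from datetime import date

def apply_scd2(dimension: list, incoming: dict, today: date) -> list:
    """Apply one incoming record to an SCD Type 2 dimension (in-memory sketch).

    Each row carries effective_from / effective_to / is_current markers;
    a changed attribute expires the current row and appends a new version,
    so history is preserved rather than overwritten.
    """
    for row in dimension:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dimension          # no attribute change: nothing to do
            row["is_current"] = False     # expire the current version
            row["effective_to"] = today
            break
    dimension.append({
        **incoming,
        "effective_from": today,
        "effective_to": None,             # open-ended current row
        "is_current": True,
    })
    return dimension

# Usage: a customer moves city, producing two versioned rows.
dim = []
apply_scd2(dim, {"customer_id": 1, "city": "Pune"}, date(2024, 1, 1))
apply_scd2(dim, {"customer_id": 1, "city": "Delhi"}, date(2024, 6, 1))
# dim now holds the expired Pune row and the current Delhi row.
```

The same expire-then-insert pattern is what a Snowflake MERGE (or Streams + Tasks pipeline) implements when CDC events arrive from the source system.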

Qualifications we seek in you!

Minimum qualifications

  • B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree, with good IT experience and relevant experience as a Senior Snowflake Data Engineer.

  • Skill Matrix:

  • Snowflake, Cortex AI, Python/PySpark, AWS/Azure, ETL concepts, Data Modeling & Data Warehousing concepts


About Company

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Job ID: 143943865