Job description
We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital media, and our people are everywhere in the world (18,000+ experts across 36 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
REQUIREMENTS:
- Expert knowledge of databases such as PostgreSQL (preferably cloud-hosted on AWS, Azure, or GCP) and the Snowflake Data Warehouse, with strong SQL programming experience.
- Competence in data preparation and/or ETL tools to build and maintain data pipelines and flows.
- Expertise in Python and experience working on ML models.
- Deep knowledge of databases, stored procedures, and optimization of large data sets.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
- Experience analyzing source data from various platforms and mapping it into Entity-Relationship (ER) models for data integration and reporting.
- Exposure to source control tools such as Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
- Experience with automated testing and coverage tools.
- Experience with CI/CD automation tools (desirable).
- Programming language experience in Golang (desirable).
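As a flavor of the ingestion skills listed above (de-duplication and partitioning), the core idea can be sketched in a few lines of stdlib-only Python. This is an illustrative sketch, not part of the role's codebase; all record fields and names here are hypothetical.

```python
from collections import defaultdict

def deduplicate(records, key="id", version="updated_at"):
    # Keep only the newest version of each record, judged by `version`.
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return list(latest.values())

def partition_by(records, field):
    # Group records into partitions keyed by `field` (e.g. region or load date).
    parts = defaultdict(list)
    for rec in records:
        parts[rec[field]].append(rec)
    return dict(parts)

# Hypothetical change feed with a duplicated id:
rows = [
    {"id": 1, "updated_at": "2024-01-02", "region": "EU"},
    {"id": 1, "updated_at": "2024-01-05", "region": "EU"},
    {"id": 2, "updated_at": "2024-01-03", "region": "US"},
]
clean = deduplicate(rows)           # two rows survive; id 1 keeps 2024-01-05
parts = partition_by(clean, "region")
```

In a warehouse, the same de-duplication is typically done with a `ROW_NUMBER()` window over the business key, and partitioning with clustering keys rather than in application code.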
RESPONSIBILITIES:
- Design and implement Snowflake-based data warehouse solutions.
- Develop and optimize complex SQL queries, stored procedures, and views in Snowflake.
- Build ETL/ELT data pipelines for efficient data processing.
- Work with structured and semi-structured data (JSON, Parquet, Avro) for data ingestion and processing.
- Implement data partitioning, clustering, and performance tuning strategies.
- Manage role-based access control (RBAC), security, and data governance in Snowflake.
- Integrate Snowflake with BI tools (Power BI, Tableau, Looker) for reporting and analytics.
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Build pipelines for optimal extraction, transformation, and loading of data from various sources using SQL and cloud database technologies.
- Prepare ML models for data analysis and prediction.
- Work with stakeholders including Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure data separation and security across national boundaries through multiple data centers and regions.
- Collaborate with data and analytics experts to enhance functionality in our data systems.
- Manage exploratory data analysis to support database and dashboard development.
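The semi-structured ingestion responsibility above (JSON, Parquet, Avro) amounts to flattening nested documents into relational rows. A minimal stdlib-only Python sketch of that step follows; the event shape and field names are invented for illustration.

```python
import json

# A hypothetical nested order event, as it might arrive from a source system.
RAW_EVENT = json.dumps({
    "order_id": 42,
    "customer": {"id": 7, "country": "DE"},
    "items": [{"sku": "A1", "qty": 2}, {"sku": "B9", "qty": 1}],
})

def flatten_order(raw: str):
    # Explode one nested order event into flat rows, one per line item --
    # roughly what a lateral-flatten operation does inside the warehouse.
    doc = json.loads(raw)
    return [
        {
            "order_id": doc["order_id"],
            "customer_id": doc["customer"]["id"],
            "country": doc["customer"]["country"],
            "sku": item["sku"],
            "qty": item["qty"],
        }
        for item in doc["items"]
    ]

rows = flatten_order(RAW_EVENT)  # two flat rows, one per item
```

In Snowflake itself this kind of explosion is usually expressed directly in SQL over a VARIANT column rather than in application code.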