Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Strong programming skills in languages and frameworks such as Python, PySpark, and SQL.
Experience building and optimizing ETL workflows using tools and technologies such as Spark, Snowflake, Airflow, Azure Data Factory, AWS Glue, and Redshift (an illustrative orchestration sketch follows this list).
Craft and optimize complex SQL queries and stored procedures for data transformation, aggregation, and analysis.
Develop and maintain data models, ensuring scalability and optimal performance.
Utilize Snowpark for data processing within the Snowflake platform (see the Snowpark sketch after this list).
Integrate Snowflake for efficient data storage and retrieval.
Exposure to API integrations that facilitate data workflows.
Experience implementing CI/CD pipelines through DevOps platforms.
Solid experience with cloud infrastructure such as Azure, AWS, or GCP.
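For illustration only, here is a minimal Airflow sketch of the kind of ETL orchestration described above. The DAG id, schedule, and task bodies are hypothetical placeholders, not a prescribed design:

```python
# Minimal Airflow DAG sketch for a daily ETL workflow.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from a source system.
    pass


def transform():
    # Placeholder: clean and aggregate the extracted data.
    pass


def load():
    # Placeholder: write results to the warehouse (e.g. Snowflake/Redshift).
    pass


with DAG(
    dag_id="daily_orders_etl",       # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three stages strictly in order.
    extract_task >> transform_task >> load_task
```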
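Similarly, a minimal Snowpark sketch of in-platform data processing; the connection parameters and the table and column names (raw_orders, order_totals, status, region, amount) are assumptions for illustration:

```python
# Minimal Snowpark sketch: read a raw table, aggregate, write the result back.
# Connection parameters and table/column names are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "<account_identifier>",  # placeholder
    "user": "<user>",                   # placeholder
    "password": "<password>",           # placeholder
    "warehouse": "<warehouse>",         # placeholder
    "database": "<database>",           # placeholder
    "schema": "<schema>",               # placeholder
}

session = Session.builder.configs(connection_parameters).create()

# Transformations are built lazily; nothing runs until an action is called.
orders = session.table("raw_orders")  # hypothetical source table
totals = (
    orders
    .filter(col("status") == "COMPLETED")  # hypothetical column
    .group_by("region")
    .agg(sum_("amount").alias("total_amount"))
)

# Persist the aggregate for downstream consumers.
totals.write.save_as_table("order_totals", mode="overwrite")
session.close()
```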
Good to have:
Experience with containerization tools such as Docker and Kubernetes.
Exposure to HTML, CSS, JavaScript/jQuery, Node.js, and Angular/React.
Experience in API development; Flask/Django experience is a bonus (a minimal sketch follows this list).
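As a small illustration of the Flask-style API development mentioned above, a minimal sketch; the route, payload shape, and handler logic are assumptions, not a required design:

```python
# Minimal Flask sketch: a JSON endpoint that could front a data workflow.
# Route, payload shape, and handler logic are illustrative assumptions only.
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/api/v1/orders/summary", methods=["POST"])
def order_summary():
    payload = request.get_json(silent=True) or {}
    region = payload.get("region", "ALL")  # hypothetical filter parameter
    # In a real service this would query a warehouse (e.g. Snowflake/Redshift).
    return jsonify({"region": region, "status": "ok"})


if __name__ == "__main__":
    app.run(debug=True)  # development server only
```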