Job Description
Skills:
Matillion ETL, Azure, Snowflake Cloud, Data Pipelines, Apache Airflow, AWS, Extract, Transform, Load (ETL), Microsoft Power BI
Greetings from Colan Infotech!
Job Title - Data Engineer
Experience - 5+ Years
Job Location - Bangalore
Notice Period - Immediate to 15 Days
Kindly find the JD below. If interested, share your updated resume to [Confidential Information].
Job Description
The Data Engineer leads the end-to-end technical design and implementation of modern, AI-enabled data solutions. They independently drive architecture decisions, build and optimize workflows, and maintain production systems that enable data-driven decisions across the enterprise. Partnering with the platform team, they influence the technology stack, deliver enhancements, and operate with ownership to set standards and automate best practices.
This position reports to the AI & Data Infrastructure Lead and is part of the Digital Products Development Group. The role can be performed remotely from Poland or India.
In This Role, You Will Have The Opportunity To
Lead the execution of complex data solutions for high-impact projects, driving technical decisions and ensuring best practices across critical initiatives.
Design and implement data pipelines and workflows using modern tools and frameworks, including Snowflake, Apache Airflow, DBT, Matillion, etc.
Build and maintain cloud-native data solutions, leveraging Azure and AWS services to ensure scalability and reliability.
Contribute to platform reliability, cost efficiency, and security by actively engaging in design discussions, executing proof-of-concepts (POCs), automating processes, and mentoring team members to ensure adherence to best practices.
Collaborate with data architects, analysts, data scientists, ML engineers, and business stakeholders to deliver high-quality, governed data and ML solutions that support data-driven decisions.
The Essential Requirements Of The Job Include
Collaboration - Strong communication skills to work effectively with data architects, analysts, and cross-functional teams.
Problem Solving - Ability to troubleshoot complex data pipeline issues and optimize workflows for scalability and reliability.
Cloud Expertise - Hands-on experience with major cloud platforms (Azure preferred; AWS or GCP is a plus), including data services like Snowflake.
Data Modeling & Optimization - Strong knowledge of an RDBMS such as Snowflake or similar, with expertise in designing efficient schemas and optimizing queries for performance and cost.
Data Pipeline Orchestration - Proficiency with workflow tools such as Apache Airflow for scheduling, monitoring, and managing ETL processes.
Data Transformation Frameworks - Strong knowledge of data processing tools such as DBT and other transformation tools for version-controlled, modular data modeling.
ETL/ELT Tools - Experience with tools like Matillion for building scalable data integration pipelines.
It Would Be a Plus If You Also Possess Previous Experience In
Visualization - Experience in building reports and dashboards using tools like Power BI or similar.
Knowledge of Data Science and AI concepts, with experience supporting ML workflows through feature datasets and containerized deployments using Docker, Fargate, or similar tools.