KEY RESPONSIBILITIES
- 14 to 15 years of experience working on data analytics and ETL design & development projects
- Expertise in tools like Informatica, Talend, SSIS, DataStage, and custom Python-based ETL frameworks
- Strong programming skills for ETL development: Hadoop, Python, SQL, shell scripting, PySpark, Sqoop, and Scala
- Data architecture experience in data modelling (OLTP & OLAP) and in data lake and data warehouse architecture
- Expertise in the big data ecosystem (Hive, Spark, HDFS), or working knowledge of cloud data warehouse platforms (Snowflake, Redshift, Azure Synapse, BigQuery)
- Experience with cloud platforms: AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse, Blob Storage), GCP (BigQuery, Dataflow)
- Familiarity with data integration & orchestration tools such as Airflow, Apache NiFi, dbt, Kafka, and REST APIs
- Design and implementation of enterprise data warehouses and data marts
- Advanced expertise in SQL tuning, ETL performance tuning, partitioning strategies, and parallel processing for optimization
SKILLS AND EXPERIENCE
ETL Design
Informatica, Talend, SSIS, DataStage, Python-based ETL frameworks
Hadoop, Python, SQL, Shell scripting, PySpark, Sqoop, Scala