Work with business and technical leadership to understand requirements.
Design solutions to those requirements and document the designs.
Write production-grade, performant code for data extraction, transformation, and loading using Spark, PySpark, and Python.
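The extract-transform-load flow above can be sketched as three small functions. This is a minimal illustration in plain Python over a hypothetical `orders` dataset (the field names and data are invented, not from the posting); comments note the PySpark DataFrame calls that would replace each step at scale.

```python
def extract(rows):
    # In PySpark: df = spark.read.parquet("s3://.../orders/")
    # Drop records with a missing amount at ingestion time.
    return [r for r in rows if r.get("amount") is not None]

def transform(rows):
    # In PySpark: df.groupBy("region").sum("amount")
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0) + r["amount"]
    return totals

def load(totals, sink):
    # In PySpark: df.write.mode("overwrite").saveAsTable("region_totals")
    sink.update(totals)
    return sink

raw = [
    {"region": "east", "amount": 10},
    {"region": "west", "amount": 5},
    {"region": "east", "amount": None},  # dropped during extract
    {"region": "east", "amount": 7},
]
warehouse = {}
load(transform(extract(raw)), warehouse)
print(warehouse)  # {'east': 17, 'west': 5}
```

Keeping extract, transform, and load as separate functions is also what makes the code modular and testable, per the next point.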
Write reusable, modular code.
Write performant queries using Teradata SQL, Hive SQL, and Spark SQL against Teradata and Hive.
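As one illustration of the kind of query involved, the sketch below shows a filtered aggregation as it might appear in Spark SQL (the `sales` table and its columns are hypothetical), with the filter placed in the WHERE clause so the engine can push it down before aggregating. The same logic is then mirrored over plain Python rows so the example is runnable without a Spark cluster.

```python
# Illustrative Spark SQL; in PySpark this would run as spark.sql(QUERY).
QUERY = """
SELECT region, SUM(amount) AS total_amount
FROM sales
WHERE sale_date >= '2024-01-01'   -- filter before aggregating
GROUP BY region
"""

def run_query(rows):
    # Plain-Python mirror of the query above: filter, then group-and-sum.
    totals = {}
    for r in rows:
        if r["sale_date"] >= "2024-01-01":
            totals[r["region"]] = totals.get(r["region"], 0) + r["amount"]
    return totals

rows = [
    {"region": "east", "amount": 3, "sale_date": "2024-02-01"},
    {"region": "east", "amount": 4, "sale_date": "2023-12-31"},  # filtered out
    {"region": "west", "amount": 2, "sale_date": "2024-03-05"},
]
print(run_query(rows))  # {'east': 3, 'west': 2}
```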
Implement DevOps pipelines to deploy code artifacts onto designated platforms/servers, such as AWS.
Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment.
Bring a quality mindset: not just code quality, but ongoing data quality, monitoring data to identify problems before they have business impact.
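A data-quality monitor of the kind described can be as simple as a per-column null-rate check that flags columns crossing a threshold. The sketch below uses hypothetical column names and an arbitrary threshold; in Spark the per-column counts would come from DataFrame aggregations, but the alerting logic is the same.

```python
def null_rates(rows, columns):
    """Fraction of null values per column."""
    n = len(rows)
    return {c: sum(1 for r in rows if r.get(c) is None) / n for c in columns}

def check_quality(rows, columns, max_null_rate=0.3):
    """Return the columns whose null rate exceeds the threshold."""
    rates = null_rates(rows, columns)
    return sorted(c for c, rate in rates.items() if rate > max_null_rate)

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": None, "email": "c@x.com"},
    {"id": 4, "email": None},
]
# email is null in 2 of 4 rows (0.5 > 0.3); id in 1 of 4 (0.25, under threshold)
print(check_quality(rows, ["id", "email"]))  # ['email']
```

Running such a check after each load, before downstream jobs consume the data, is what catches problems before they have business impact.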