Job description
What will you do
You will play a key role in strengthening our data engineering capabilities, delivering data engineering pipelines and solutions through the Enterprise Data Platform (EDP) for various groups and divisions.
General Job Functions
- Collaborate with functional analysts to convert requirements into data engineering pipelines.
- Collaborate with the scrum master on product backlogs and help with sprint planning.
- Build, test, and optimize data pipelines for various use-cases, including real-time and batch processing, based on specific requirements.
- Support the evolution of the EDP architecture and take part in roadmap activities around data platform architecture initiatives and changes.
- Collaborate with leadership and partners to ensure data quality and integrity in DWH & AWS platforms for BI/Analytical reporting.
- Provide hands-on mentorship and oversight for a group of projects.
- Identify potential risks early and communicate effectively with partners to develop and implement risk mitigation plans.
- Actively support development activities in data engineering, ensuring bandwidth is available when needed.
- Implement and follow agile methodologies to deliver solutions and product features, adhering to DevOps practices.
- Ensure the team follows the prescribed development processes and approaches.
Must have skills and experience
- 10+ years of overall work experience with 7+ years exclusively in delivering data solutions.
- 5+ years of validated experience building Cloud BI solutions using AWS.
- Experience with agile development methodologies, following DevOps, DataOps, and DevSecOps practices.
- 5+ years of programming in SQL, PySpark, and Python.
- Excellent written, verbal, interpersonal, and partner communication skills.
- Excellent analysis and business requirements documentation skills.
- Ability to work with multi-functional teams across multiple regions/time zones, communicating effectively through multiple channels (email, MS Teams voice and chat, meetings).
- Excellent prioritization and problem-solving skills.
Good to have skills
- Hands-on experience with Snowflake or Azure data engineering.
- Knowledge of SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, and Cassandra.
- Experience in building data pipelines in Databricks.
- Data visualization experience using tools such as Power BI or Tableau.
- Knowledge of data governance practices, data quality, and data security.
- Relevant certifications on cloud platforms.
- Basic understanding of machine learning and Generative AI concepts.