Key Responsibilities:
- Design scalable data ingestion and integration solutions that enable smooth onboarding of business data into the TRD Data platform.
- Ensure data is fit for use by applying business and technical rules throughout its life cycle.
- Identify automation opportunities and accelerators wherever possible.
- Build data pipelines using Python, CI/CD, and transformation workflows to process and integrate various data sets from the ingestion layer to the consumption layer.
- Align with Solution Architects and vendors on best practices.
- Ensure data management practices are adopted for data quality, data modeling, harmonization, standards, ontologies, etc.
- Handle metadata effectively and leverage enterprise ontology management.
- Ensure FAIR (Findable, Accessible, Interoperable, Reusable) data principles are adhered to wherever applicable.
- Perform requirement scoping assessments to determine feasibility of projects.
- Highlight/identify gaps in existing functionality and review requirements with stakeholders.
- Develop a comprehensive requirement specification that will determine the estimate of cost, time and resources to deploy solutions.
- Liaise with the service development team to suggest a high-level functional solution.
- Develop project estimates and complete financial model (costs, savings, revenue opportunities, investment horizon, etc.)
- Ensure that relevant stakeholders are involved in specification of new services and/or major upgrades to existing services.
- Ensure the overall user experience is taken into account when designing and deploying data solutions and services.
- Ensure implemented solutions are according to specifications and fit for purpose.
- Support end-user training and self-service activities.
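To illustrate the pipeline and error-handling responsibilities above, here is a minimal Python sketch of an ingestion-to-consumption flow. The function names, the sample CSV, the "positive amount" business rule, and the list used as a sink are all hypothetical, chosen only to show the pattern:

```python
import csv
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingestion")

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into row dictionaries (ingestion layer)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Apply a business rule (illustrative: amount must be positive); quarantine bad rows."""
    clean, rejected = [], []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])
            if row["amount"] <= 0:
                raise ValueError("non-positive amount")
            clean.append(row)
        except (KeyError, ValueError) as exc:
            rejected.append({"row": row, "error": str(exc)})
    if rejected:
        log.warning("quarantined %d rows", len(rejected))
    return clean

def load(rows: list[dict], target: list) -> None:
    """Append validated rows to the consumption layer (a list stands in for a real sink)."""
    target.extend(rows)

# Usage with made-up data: one valid row, one rule violation, one parse error.
raw = "id,amount\n1,10.5\n2,-3\n3,abc\n"
sink: list[dict] = []
load(transform(extract(raw)), sink)
```

In a real pipeline the quarantined rows would be routed to an error table for reprocessing rather than just logged.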
Essential Requirements
Education Qualifications
University degree in Informatics, Computer Science, Life Sciences, or a similar field
Experience
- 5+ years of relevant experience in data engineering or IT in healthcare.
- Must have experience designing and implementing ETL/BI data products, primarily using Python and R.
- Experience in building data transfer workflows and data error handling techniques.
- Well versed in data ingestion patterns in the AWS cloud and third-party tools.
- Working experience handling data objects in an AWS cloud S3 Redbrick environment.
- Strong SQL query-writing skills on relational databases such as Oracle and MS SQL Server.
- Good understanding of data consumption topics such as data science, reports, dashboards, and KPIs.
- Working knowledge of ETL tools such as Alteryx and BI tools such as Power BI.
- Good understanding of metadata and data discovery techniques.
- Experience with AWS cloud data integration and data management technologies.
- Familiar with data architecture and data engineering concepts such as data modelling, data lakes, and data analytics.
- Strong analytical, critical thinking, and problem-solving skills
- Working experience in Agile projects and methodology.
- Excellent communication skills and experience working with business teams; good understanding of data analysis design and data architecture concepts.
- Knowledge of GxP and Agile project methodology.
- Proficient in collaboration, facilitation, negotiation, and conflict resolution, and in working with global teams in a matrix environment.
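As a rough illustration of the SQL query-writing and data-consumption skills listed above, the sketch below runs a KPI-style aggregation. An in-memory SQLite database stands in for Oracle or MS SQL Server, and the `sales` table and its values are made up; the SQL itself is portable:

```python
import sqlite3

# In-memory SQLite stands in for a production relational DB.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('EU', 100), ('EU', 250), ('US', 400);
""")

# A typical KPI aggregation behind a report or dashboard:
query = """
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING SUM(amount) > 150
    ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('US', 400.0), ('EU', 350.0)]
```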