- Design, develop, and implement data pipelines, ETL/ELT processes, and data integration solutions
- Contribute to data pipeline projects from inception to deployment, managing scope, timelines, and risks
- Contribute to data models for biopharma scientific data, data dictionaries, and other documentation to ensure data accuracy and consistency
- Optimize large datasets for query performance
- Collaborate with global cross-functional teams including research scientists to understand data requirements and design solutions that meet business needs
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, Software Engineers, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Maintain documentation of processes, systems, and solutions
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree and 1 to 3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field OR
- Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field OR
- Diploma and 7 to 9 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field
Preferred Qualifications:
- 1+ years of experience in implementing and supporting custom software development for drug discovery
Functional Skills:
Must-Have Skills:
- Proficient in a general-purpose high-level language (e.g., Node.js/Koa, Python, Java, C#/.NET)
- Proficient in a JavaScript UI framework (e.g., React or Ext JS)
- Proficient in SQL (e.g., Oracle, PostgreSQL, or Databricks)
- Experience with automated testing tools and frameworks (e.g., Jest, Playwright, Cypress, or Selenium)
Good-to-Have Skills:
- Strong understanding of cloud platforms (e.g., AWS) and containerization technologies (e.g., Docker, Kubernetes)
- Working experience with DevOps practices and CI/CD pipelines
- Experience with big data technologies (e.g., Spark, Databricks)
- Experience with API integration, serverless computing, and microservices architecture
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk)
- Experience with infrastructure-as-code (IaC) tools (e.g., Terraform, CloudFormation)
- Experience with version control systems like Git
- Strong understanding of software development methodologies, mainly Agile and Scrum
- Strong problem-solving and analytical skills; ability to learn quickly and work independently; excellent communication and interpersonal skills
Professional Certifications:
- AWS Certified Cloud Practitioner (preferred)
Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills