- Work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to design, develop, implement, and maintain solutions supporting functional areas such as Manufacturing, Commercial, and Research and Development.
- Work closely with the Enterprise Data Lake delivery and platform teams to ensure that the applications are aligned with the overall architectural and development guidelines.
- Research and evaluate technical solutions, including Databricks and AWS services, NoSQL databases, and data science packages, platforms, and tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, maintainability, and cost management.
- Assist in building and maintaining relationships with internal and external business stakeholders.
- Develop a basic understanding of core business problems and identify opportunities to apply advanced analytics.
- Assist in reviewing third-party providers for feature, function, and technical fit with the department's data management needs.
- Work closely with the Enterprise Data Lake ecosystem leads to identify and evaluate emerging providers of data management and processing components that could be incorporated into the data platform.
- Work with platform stakeholders to ensure effective cost observability and control mechanisms are in place for all aspects of data platform management.
- Experience developing in an Agile environment and comfort with Agile terminology and ceremonies.
- Keen on embracing new responsibilities, facing challenges, and mastering new technologies.
What we expect of you
Basic Qualifications and Experience:
- Master's degree in a computer science or engineering field and 1 to 3 years of relevant experience, OR
- Bachelor's degree in a computer science or engineering field and 3 to 5 years of relevant experience, OR
- Diploma and a minimum of 8 years of relevant work experience.
Must-Have Skills:
- Experience with Databricks (or Snowflake), including cluster setup, execution, and tuning.
- Experience with common data processing libraries: Pandas, PySpark, SQLAlchemy.
- Experience with UI frameworks (Angular or React).
- Experience with data lake, data fabric, and data mesh concepts.
- Experience with data modeling, performance tuning, and relational databases.
- Experience building ETL or ELT pipelines; hands-on experience with SQL and NoSQL databases.
- Programming skills in one or more languages: SQL, Python, Java.
- Experience with software engineering best practices, including but not limited to version control (Git, GitLab), CI/CD (GitLab, Jenkins, etc.), automated unit testing, and DevOps.
- Exposure to Jira or Jira Align.
Good-to-Have Skills:
- Knowledge of the R language is an advantage.
- Experience with cloud technologies; AWS preferred.
- Cloud certifications: AWS, Databricks, or Microsoft.
- Familiarity with AI tools for development productivity, such as GitHub Copilot, Databricks Assistant, Amazon Q Developer, or equivalent.
- Knowledge of Agile and DevOps practices.
- Skills in disaster recovery planning.
- Familiarity with load testing tools (JMeter, Gatling).
- Basic understanding of AI/ML for monitoring.
- Knowledge of distributed systems and microservices.
- Data visualization skills (Tableau, Power BI).
- Strong communication and leadership skills.
- Understanding of compliance and auditing requirements.
Soft Skills:
- Excellent analytical and problem-solving skills.
- Excellent written and verbal communication skills (English), with the ability to translate technical content into business language at various levels.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong time and task management skills, with the ability to estimate and meet project timelines and to bring consistency and quality assurance across projects.