Job description
- Knowledge Graph: enabling data-intelligent applications through semantic data modelling; a data foundation for Digital Twin and BIM.
- A knowledge graph represents a collection of interlinked descriptions of real-world entities such as objects and events. The graph is processable by machines as well as humans.
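As an illustrative sketch only (the entity names below are hypothetical, not part of the role's data model), a knowledge graph can be modelled as a set of subject-predicate-object triples that both programs and people can read:

```python
# Minimal knowledge-graph sketch: real-world entities linked by
# subject-predicate-object triples. All names are illustrative.
triples = {
    ("Building_A", "hasFloor", "Floor_1"),
    ("Floor_1", "contains", "Room_101"),
    ("Room_101", "hasSensor", "TempSensor_7"),
    ("TempSensor_7", "measures", "Temperature"),
}

def objects_of(subject, predicate):
    """Return all objects linked to `subject` via `predicate`."""
    return {o for s, p, o in triples if s == subject and p == predicate}

print(objects_of("Floor_1", "contains"))  # {'Room_101'}
```

In production, the same idea is usually expressed in RDF and queried with SPARQL rather than a Python set, but the triple structure is identical.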
JOB REQUIREMENTS/ SKILLS:
- 5-8 years of experience developing software solutions with various application programming languages.
- Experience with research and development processes: implementing strategies and developing prototypes.
- Data engineering and analytics: building data pipelines and implementing algorithms in a distributed environment.
- Create automated ETL pipelines for a wide variety of data sources.
- Develop microservice architectures to build self-service APIs.
- Integrate data with analytics tools.
- Continuously improve code through refactoring and optimization.
- Create technical enablers for the program increment.
- Develop and deploy web applications on the cloud, with a solid understanding of one or more frameworks such as Flask or Django.
- Write unit test cases and maintain code quality.
- Work with Agile/Lean development methods using Scrum/Kanban.
- Collaborate with the team as well as with all stakeholders; be highly proactive and a team player.
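As a hedged sketch of the kind of automated ETL pipeline listed above (the source data and schema here are hypothetical), using only the Python standard library:

```python
import csv
import io
import sqlite3

# Extract: read raw sensor readings. An in-memory CSV stands in for a
# real source such as a file drop, REST API, or message queue.
raw = io.StringIO("sensor,reading\nTempSensor_7,21.5\nTempSensor_8,bad\n")
rows = list(csv.DictReader(raw))

# Transform: validate each row and convert readings to floats,
# dropping rows that fail validation.
clean = []
for r in rows:
    try:
        clean.append((r["sensor"], float(r["reading"])))
    except ValueError:
        continue  # malformed reading, skip it

# Load: write the validated rows into a SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)", clean)
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 1 valid row loaded
```

The same extract/transform/load shape scales up when the in-memory pieces are swapped for distributed equivalents (object storage, Spark, a warehouse).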
TECHNICAL SKILLS:
- Mandatory and strong hands-on experience:
- Python; Python libraries like Django/Flask; REST APIs/microservices
- NoSQL or SQL databases such as MongoDB, PostgreSQL, SQLite
- CI/CD platforms like Jenkins; Docker & Kubernetes
- Strong in OOP and design-pattern concepts
- Data structures, with the ability to write logical programs
- GIT, Jira (Agile)
- Unit-testing frameworks like PyTest, etc.
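For the unit-testing expectation, a minimal PyTest-style example (the function under test is hypothetical): pytest discovers `test_`-prefixed functions automatically and treats a failing bare `assert` as a test failure, so no assertion framework is needed:

```python
def slugify(name):
    """Normalize an entity name for use as a graph identifier."""
    return name.strip().lower().replace(" ", "_")

# pytest collects functions named test_* from test_*.py files
# and runs each one as an independent test case.
def test_slugify_basic():
    assert slugify("Temp Sensor") == "temp_sensor"

def test_slugify_strips_whitespace():
    assert slugify("  Room 101 ") == "room_101"
```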
GOOD TO HAVE:
- AWS (Lambda, ECS, EC2, S3, SQS, Elastic Transcoder)
- Custom ETL workflow experience
- Python libraries like NumPy, Pandas, etc.
- UI frameworks like Angular 8+, HTML, CSS
- Knowledge of AI/ML