Job Description
What you get to do (your responsibilities)
- Develop high-quality, scalable data processing infrastructure and services capable of ingesting and transforming data at huge scale from many different sources on a schedule.
- Turn ideas and concepts into carefully designed and well-authored quality code.
- Articulate the interdependencies and the impact of the design choices.
- Develop APIs to power data-driven products, including external APIs consumed by internal and external customers of the data platform.
- Collaborate with QA, product management, engineering, and UX to achieve well-groomed, predictable results.
- Improve and develop new engineering processes & tools.
- Fluidly adapt to changes and new requirements.
What you've probably done (your qualifications):
- 3+ years of experience working on UI projects using HTML, CSS, and JavaScript
- 4+ years of experience working with ReactJS (16.8 or later)
- 2+ years of experience with Node.js and npm
- 3+ years of experience developing UI screens that consume REST APIs; good understanding of making API calls and parsing the results they return
- Experience building REST-based microservices in a distributed architecture, along with cloud technologies (AWS preferred) - good to have
- Strong understanding of web programming
- Experience with at least one state management library (e.g., Redux) preferred
- Good knowledge of Jira, Git, Jenkins
- Development experience using Agile
- Hands-on experience improving the performance of web page loading and API calls
- Experience in object-oriented design and development with languages such as Java
- Knowledge of Java/J2EE frameworks such as Spring Boot, JPA, and JDBC is an added advantage
- Experience with a variety of data stores for unstructured and columnar data, as well as traditional database systems (e.g., MySQL, PostgreSQL)
- Proven ability to deliver working solutions on time
- Strong analytical thinking to tackle challenging engineering problems.
- Great energy and enthusiasm with a positive, collaborative working style, clear communication and writing skills.
- Experience working in a DevOps environment ("you build it, you run it")
Nice to have
- Built high-throughput real-time and batch data processing pipelines using Spark and Kafka in an AWS environment, with AWS services like S3, Kinesis, Lambda, RDS, DynamoDB, or Redshift.
- Experience with big data technologies and exposure to Hadoop, Spark, AWS Glue, AWS EMR, etc.