Experience: 4 - 7 Years
Location: Whitefield - Bangalore
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, AWS, SQL, DSA, Databricks, Hadoop, Spark, MapReduce, HBase.
Responsibilities:
- Good development practices.
- Hands-on coder with good experience in programming languages such as Python or PySpark.
- Hands-on experience with the Big Data stack, including Hadoop, MapReduce, Spark, HBase, and Elasticsearch.
- Good understanding of programming principles and development practices such as check-in policies, unit testing, and code deployment.
- Self-starter able to grasp new concepts and technologies and translate them into large-scale engineering developments.
- Excellent experience in application development and support, integration development, and data management.
- Hands-on coder with a good understanding of enterprise-level code.
- Design and implement APIs, abstractions, and integration patterns to solve challenging distributed computing problems.
- Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a parallel processing environment.
Culture
- Must be a strategic thinker with the ability to think unconventionally, outside the box.
- Analytical and data-driven orientation.
- Raw intellect, talent and energy are critical.
- Entrepreneurial and agile; understands the demands of a private, high-growth company.
- Ability to be both a leader and a hands-on doer.
Qualifications
A proven track record of relevant work experience and a degree in Computer Science or a related technical discipline is required.
Experience with functional and object-oriented programming in Python or PySpark is a must.
Hands-on experience with the Big Data stack, including Hadoop, MapReduce, Spark, HBase, and Elasticsearch.
Good understanding of AWS services and experience working with APIs and microservices.
Effective communication skills (both written and verbal).
Ability to collaborate with a diverse set of engineers, data scientists and
product managers
Comfort in a fast-paced start-up environment
Preferred Qualifications
Experience in agile methodology
Experience with database modeling and development, data mining, and warehousing.
Experience in the architecture and delivery of enterprise-scale applications, with the ability to develop frameworks, design patterns, etc. Should be able to understand and tackle technical challenges, propose comprehensive solutions, and guide junior staff.
Experience working with large, complex data sets from a variety of sources
Skills: Data, Python, Big Data, HBase, Hadoop, MapReduce, Spark, AWS, Agile, PySpark, Data Structures, Algorithms, Databricks, API