
Fractal

AWS Data Engineer


Job Description

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

AWS Data Engineer at Fractal.ai

Fractal is one of the most prominent players in the Artificial Intelligence space.

Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to help the world's most admired Fortune 500 companies.

Fractal has more than 3,000 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated as one of India's best companies to work for by The Great Place to Work Institute, featured as a leader in the Customer Analytics Service Providers Wave 2021, Computer Vision Consultancies Wave 2020 & Specialized Insights Service Providers Wave 2020 by Forrester Research, and recognized as an Honorable Vendor in the 2021 Magic Quadrant for data & analytics by Gartner.

Experience: 4 to 14 years

Location: Mumbai / Bengaluru / Gurgaon / Chennai / Pune

Competencies

  • Passion for educating, training, designing, and building end-to-end systems that lead a diverse and challenging set of customers to success.
  • A forward thinker and self-starter who flourishes with new challenges and adapts quickly when learning new skills.
  • Ability to work with a global team of consulting professionals across multiple projects.
  • Knack for helping an organization understand application architectures and integration approaches, architect advanced cloud-based solutions, and help launch the build-out of those systems.

Roles & Responsibilities


  • 2 or more years of hands-on experience with AWS services, especially AWS Glue.
  • Strong hands-on experience with AWS Glue, including automation build-out and integration with other AWS services using Spark (see the Glue sketch after this list).
  • Hands-on administration skills with AWS Managed Streaming for Apache Kafka (MSK), including configuration for creating topics and controlling consumers and producers (see the topic-creation sketch after this list).
  • Review and decompose the existing Java application so it can be replaced with either a Mule service built by the Mule squad or a Lambda service in Python within the Operational Data squad.
  • Ideally suited to someone who has worked on Java development in the past.
  • Working experience with AWS Athena, Glue PySpark, EMR, DynamoDB, S3, Redshift, Kinesis, Lambda, Apache Spark, Databricks on AWS, and Snowflake on AWS.
  • Work across all phases of the SDLC and apply software engineering principles to build scalable solutions.
  • Be an integral part of large-scale client business development and delivery engagements.
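
To give a concrete picture of the Glue-with-Spark work mentioned above, here is a minimal sketch of a Glue PySpark job. The database, table, and bucket names are hypothetical placeholders, and this is only an illustration of the kind of script involved, not Fractal's actual pipeline code.

```python
# Minimal AWS Glue job sketch (PySpark): reads a table from the Glue Data
# Catalog, applies a simple filter, and writes the result to S3 as Parquet.
# "raw_db", "orders", and the S3 path are placeholders -- adjust to your environment.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table registered in the Glue Data Catalog (hypothetical names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Example transformation: drop records with a null order id.
cleaned = source.filter(lambda row: row["order_id"] is not None)

# Write the result back to S3 as Parquet (hypothetical bucket/prefix).
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()
```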
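
For the MSK administration point, topic creation is typically done through a Kafka admin client. Below is a minimal sketch using the kafka-python library; the broker address, topic name, and partition settings are placeholders, and MSK clusters secured with TLS or IAM auth need additional connection settings not shown here.

```python
# Minimal sketch: creating a topic on an Amazon MSK cluster with the
# kafka-python admin client. Broker string and topic settings are placeholders.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(
    bootstrap_servers="b-1.example-msk.amazonaws.com:9092",  # hypothetical broker
    client_id="ops-data-admin",
)

# Create a topic with 3 partitions and a replication factor of 3.
admin.create_topics(
    [NewTopic(name="orders-events", num_partitions=3, replication_factor=3)]
)

admin.close()
```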

Education: A bachelor's degree in Computer Science or a related field, with technology experience

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!



Job ID: 141171389
