
Java + Spark

  • Posted 7 hours ago

Job Description

The team

Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile

We are looking for a skilled Java + Spark Developer with experience in building scalable data processing applications. The ideal candidate has strong expertise in Java development and hands-on experience with big data technologies such as Apache Spark.

Key responsibilities:

  • Develop and maintain scalable data processing pipelines using Apache Spark
  • Write clean, efficient, and reusable code in Java
  • Design and implement batch and real-time data processing solutions
  • Work with distributed data systems like Hadoop and Hive
  • Optimize Spark jobs for performance and scalability
  • Collaborate with data engineers, analysts, and cross-functional teams
  • Debug, troubleshoot, and enhance existing applications
  • Ensure code quality through unit testing and code reviews
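The responsibilities above center on map/filter/aggregate pipelines. As a rough illustration of that style without requiring a Spark cluster, here is a word count written in plain Java 8 streams (class and method names are illustrative, not from this posting); Spark's JavaRDD and Dataset APIs expose the same flatMap/filter/reduce shape, distributed across a cluster instead of a single JVM:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class WordCount {
    // Count word occurrences with a map -> filter -> group/reduce pipeline,
    // the same shape a Spark batch job would use on a distributed dataset.
    static Map<String, Long> wordCount(Stream<String> lines) {
        return lines
                // split each line into lowercase words (flatMap step)
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\W+")))
                // drop empty tokens produced by leading separators (filter step)
                .filter(word -> !word.isEmpty())
                // group identical words and count them (reduce step)
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = wordCount(Stream.of("spark and java", "java streams"));
        System.out.println(counts.get("java")); // prints 2
    }
}
```

In Spark, the same logic would typically run over a `JavaRDD<String>` or `Dataset<String>` read from HDFS or Hive, with the grouping executed as a shuffle across executors.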

Key skills required:

  • Experience: 6-9 years
  • Strong experience in Java (Java 8 or above preferred)
  • Hands-on experience with Apache Spark (Core, SQL, DataFrames, Streaming)
  • Knowledge of big data ecosystem: Hadoop, Hive, Kafka
  • Experience with RESTful APIs and microservices architecture
  • Familiarity with SQL and NoSQL databases (e.g., MySQL, MongoDB)
  • Understanding of distributed computing concepts
  • Experience with build tools like Maven/Gradle

Good to Have

  • Experience with cloud platforms like Amazon Web Services or Microsoft Azure
  • Knowledge of containerization tools like Docker and Kubernetes
  • Exposure to CI/CD pipelines
  • Experience with Scala

Education:

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 2-6 years of relevant experience (may vary by role)

Soft Skills

  • Strong problem-solving skills
  • Good communication and teamwork
  • Ability to work in an agile environment

Location and Way of Working:

  • Base location: Bengaluru, Pune, Chennai


Job ID: 147463675
