
JPMorganChase

Software Engineer II - Data Engineer - Spark, Python, Databricks or AWS EMR

  • Posted 11 hours ago

Job Description

You're ready to gain the skills and experience needed to grow within your role and advance your career, and we have the perfect software engineering opportunity for you.

As a Software Engineer II - Data Engineer - Spark, Python, Databricks or AWS EMR at JPMorgan Chase within the Commercial & Investment Bank, you'll be part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you will execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Spark on Databricks or AWS EMR.
  • Write efficient SQL queries for data extraction, transformation, and analysis.
  • Collaborate with data scientists, analysts, and other engineering teams to deliver high-quality data solutions.
  • Implement data processing workflows on AWS services such as S3, ECS, Lambda, EMR, and Glue.
  • Develop and maintain Python scripts for data processing and automation.
  • Ensure data quality, integrity, and security across all data engineering activities.
  • Troubleshoot and resolve data-related issues in a timely manner.
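The responsibilities above center on extract-transform-load (ETL) pipelines. As a rough sketch of that pattern, here is a minimal pure-Python version (standing in for a Spark job on Databricks or EMR; the field names and sample data are hypothetical):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV into records (stands in for reading from S3)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Transform: cast types and drop malformed rows."""
    out = []
    for r in records:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # data-quality step: skip rows that fail validation
    return out

def load(records: list[dict]) -> dict:
    """Load: aggregate into a summary (stands in for writing to a warehouse)."""
    return {"rows": len(records), "total": sum(r["amount"] for r in records)}

raw = "id,amount\n1,10.5\n2,oops\n3,4.5\n"
summary = load(transform(extract(raw)))
print(summary)  # {'rows': 2, 'total': 15.0}
```

In a real Spark pipeline each stage would operate on a distributed DataFrame rather than an in-memory list, but the extract/transform/load separation is the same.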

Required Qualifications, Capabilities, And Skills

  • Formal training or certification on software engineering concepts and 2+ years of applied experience.
  • Proven expertise in Data Engineering with Spark.
  • Hands-on experience with Databricks or AWS EMR.
  • Strong knowledge of SQL and database concepts.
  • Experience in ETL and data processing workflows.
  • Proficiency in AWS services: S3, ECS, Lambda, EMR/Glue.
  • Advanced skills in Python programming.
  • Excellent problem-solving and analytical abilities.
  • Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent experience).
  • Strong communication and collaboration skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications, Capabilities, And Skills

  • Experience with Infrastructure as Code (IaC) using Terraform or CloudFormation.
  • Familiarity with writing unit test cases for Python code.
  • Knowledge of version control systems such as Bitbucket or GitHub.
  • Understanding of CI/CD pipelines and automation tools.
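On the unit-testing point, a minimal sketch of how a data-cleaning helper might be tested with the standard library's unittest (the function under test, `normalize_amount`, is a hypothetical example, not part of the role's codebase):

```python
import unittest

def normalize_amount(raw: str) -> float:
    """Hypothetical helper: strip currency symbols and commas, parse to float."""
    return float(raw.replace("$", "").replace(",", "").strip())

class TestNormalizeAmount(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(normalize_amount("42"), 42.0)

    def test_currency_and_commas(self):
        self.assertEqual(normalize_amount(" $1,234.50 "), 1234.5)

    def test_invalid_input_raises(self):
        with self.assertRaises(ValueError):
            normalize_amount("n/a")

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=2)
```

Tests like these are what a CI/CD pipeline would run on every commit before a deployment proceeds.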

More Info


Job ID: 143090561