Oracle

Python + PySpark - Data Engineering

  • Posted 2 days ago

Job Description

Databricks and Python Engineer

About Oracle FSGIU - Finergy:

The Finergy division within Oracle FSGIU focuses exclusively on the Banking, Financial Services, and Insurance (BFSI) sector, offering deep domain knowledge to address complex financial needs. Finergy offers:

- Industry expertise in BFSI.

- Accelerated implementation: proven methodologies that fast-track the deployment of multi-channel delivery platforms, minimizing IT intervention and reducing time to market.

- Personalization: tools that tailor customer experiences, which have kept customers loyal to Finergy for over a decade.

- End-to-end banking solutions: a single platform for a wide range of banking services (trade, treasury, cash management), enhancing operational efficiency with integrated dashboards and analytics.

- Expert consulting services: comprehensive consulting support, from strategy development to solution implementation, ensuring the alignment of technology with business goals.


Job Responsibilities

1. Software Development:

- Design, develop, test, and deploy high-performance, scalable data solutions using Python, PySpark, and SQL.

- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.

- Implement efficient and maintainable code using best practices and coding standards.

2. Databricks Platform:

- Work with the Databricks platform for big data processing and analytics.

- Develop and maintain ETL processes using Databricks notebooks (a brief illustrative sketch follows this list).

- Implement and optimize data pipelines for data transformation and integration.

3. Continuous Learning:

- Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.

- Share knowledge with the team and contribute to a culture of continuous improvement.

4. SQL Database Management:

- Utilize expertise in SQL to design, optimize, and maintain relational databases.

- Write complex SQL queries for data retrieval, manipulation, and analysis.

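For illustration, here is a minimal sketch of the kind of PySpark ETL step such a Databricks notebook might contain. It is only a sketch under assumed inputs: the S3 path, column names, and target table name are hypothetical placeholders, not part of this role's actual systems.

    # Hypothetical notebook-style ETL step: read raw data, aggregate, persist, query.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # In a Databricks notebook, `spark` is already provided; getOrCreate() keeps the sketch self-contained.
    spark = SparkSession.builder.appName("daily_summary_etl").getOrCreate()

    # Extract: read raw transactions from a hypothetical S3 location.
    raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

    # Transform: basic cleansing plus a daily per-account aggregate.
    daily = (
        raw.filter(F.col("amount") > 0)
           .withColumn("txn_date", F.to_date("txn_timestamp"))
           .groupBy("account_id", "txn_date")
           .agg(F.sum("amount").alias("total_amount"),
                F.count("*").alias("txn_count"))
    )

    # Load: persist as a managed table, then inspect it with SQL.
    daily.write.mode("overwrite").saveAsTable("analytics.daily_account_summary")
    spark.sql("""
        SELECT account_id, txn_date, total_amount
        FROM analytics.daily_account_summary
        WHERE total_amount > 100000
        ORDER BY total_amount DESC
    """).show()
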
Mandatory Skills:

- 6 to 10 years of experience with Databricks and big data frameworks

- Advanced proficiency in AWS, including EC2, S3, and container orchestration (Docker, Kubernetes)

- Proficiency in AWS services and data migration

- Experience with Unity Catalog

- Familiarity with batch and real-time processing (see the brief sketch after this list)

- Data engineering with strong skills in Python, PySpark, and SQL

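To illustrate the batch versus real-time distinction, here is a minimal PySpark sketch; the paths and the Unity Catalog table name are hypothetical, and the "cloudFiles" (Auto Loader) source assumes a Databricks runtime.

    from pyspark.sql import SparkSession

    # In Databricks notebooks `spark` is provided; getOrCreate() keeps the sketch self-contained.
    spark = SparkSession.builder.getOrCreate()

    # Batch: read a complete snapshot from a hypothetical S3 location.
    batch_df = spark.read.json("s3://example-bucket/events/2024/")

    # Real-time: incrementally pick up new files as they land, via Structured Streaming.
    stream_df = (
        spark.readStream
             .format("cloudFiles")                      # Databricks Auto Loader (Databricks-only)
             .option("cloudFiles.format", "json")
             .load("s3://example-bucket/events/")
    )

    # Stream into a Unity Catalog table, addressed by its three-level name (catalog.schema.table).
    query = (
        stream_df.writeStream
                 .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
                 .toTable("main.analytics.events_bronze")
    )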

Career Level - IC3

About Company

Oracle Corporation is an American multinational computer technology corporation headquartered in Austin, Texas. In 2020, Oracle was the second-largest software company in the world by revenue and market capitalization. The company sells database software and technology (particularly its own brands), cloud engineered systems, and enterprise software products, such as enterprise resource planning (ERP) software, human capital management (HCM) software, customer relationship management (CRM) software (also known as customer experience), enterprise performance management (EPM) software, and supply chain management (SCM) software.

Job ID: 142069541