
Niveus Solutions

Cloud Specialist - Databricks + Python

Fresher
Posted 17 days ago

Job Description

We are seeking a skilled Backend Data Engineer to join our dynamic team. The ideal candidate will be responsible for designing and implementing data pipelines, ensuring data integrity, and developing backend services that support data processing and analytics. You will work closely with data scientists, analysts, and other engineers to manage and process large datasets, using Databricks and various Python frameworks. Your role will involve optimizing data workflows and ensuring that our data infrastructure is robust, scalable, and efficient.

Key Responsibilities:
- Design, develop, and maintain data pipelines using Databricks (a minimal sketch follows this list).
- Collaborate with cross-functional teams to gather requirements and deliver high-quality data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data lakes and warehouses.
- Write clean, efficient, and well-documented Python code for data processing.
- Optimize data models and queries for performance and scalability.
- Monitor and troubleshoot data pipeline performance issues.
- Ensure data quality and integrity throughout all stages of data processing.
- Stay updated with emerging technologies and best practices in data engineering.
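For illustration only, here is a minimal sketch of the kind of Databricks/PySpark ETL pipeline these responsibilities describe. The storage path, column names, and target table (orders data under /mnt/raw/orders/, written to analytics.orders_clean) are hypothetical placeholders, not part of this role's actual stack.

```python
# Minimal extract-transform-load sketch; all names are illustrative.
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` is provided by the runtime;
# building one explicitly only matters when running the sketch elsewhere.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files landed in cloud storage (hypothetical mount).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("/mnt/raw/orders/"))

# Transform: basic deduplication, filtering, and typing.
clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("order_total") > 0)
         .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd")))

# Load: write to a managed Delta table (assumes a Databricks/Delta Lake runtime).
(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.orders_clean"))
```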

Skills Required:
- Proficiency in Python programming and data manipulation libraries such as Pandas and NumPy (see the sketch after this list).
- Experience with Databricks and Apache Spark for big data processing.
- Strong understanding of data warehousing concepts and ETL processes.
- Familiarity with SQL for querying relational databases.
- Knowledge of data modeling techniques and practices.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and ability to work collaboratively in a team environment.
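As a small, self-contained example of the Pandas/NumPy data-manipulation skills listed above (the orders data and column names are invented for the example):

```python
# Illustrative Pandas/NumPy snippet: impute a missing value, then aggregate.
import numpy as np
import pandas as pd

orders = pd.DataFrame({
    "region": ["north", "south", "north", "west"],
    "order_total": [120.0, 85.5, np.nan, 42.0],
})

# Fill missing totals with the column median, then summarise by region.
orders["order_total"] = orders["order_total"].fillna(orders["order_total"].median())
summary = (orders
           .groupby("region", as_index=False)
           .agg(total_revenue=("order_total", "sum"),
                order_count=("order_total", "size")))
print(summary)
```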

Tools Required:
- Databricks for data engineering tasks.
- Python for backend development and data processing.
- Apache Spark for handling large-scale data processing.
- SQL databases such as PostgreSQL, MySQL, or similar.
- Cloud services (AWS, Azure, or Google Cloud) for data storage and processing.
- Git for version control and collaboration.

Job ID: 134147347