
Senior Data Engineer

Uplers

Job Description

Senior Data Engineer (Remote)
  • Experience: 4+ years
  • Expected Notice Period: 2 to 4 weeks
  • Shift: 9:00 AM to 6:00 PM IST
  • Opportunity Type: Remote
  • Placement Type: Contractual
  • Contract Duration: Full-Time, 12 months
  • (Note: This is a requirement for one of Uplers' partners.)

What do you need for this opportunity?

Must-have skills: Apache Airflow, AWS / GCP, Data Pipeline, Databricks, Databricks Marketplace, DBT, ETL, Snowflake, Snowflake Marketplace, Python

Good-to-have skills: Big Data, Excellent Communication Skills, PostgreSQL
Our Hiring Partner is Looking for:

A Senior Data Engineer (Remote) who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview:

Experience & Requirements:
  • 4-7 years of experience as a Senior Data Engineer.
  • Database and Data Warehousing: Advanced experience with Snowflake, Databricks, and other data warehousing solutions.
  • Data Transformation and ETL: Proficiency in DBT and experience building and maintaining ETL pipelines.
  • Cloud Platforms: Extensive experience with AWS and GCP.
  • Programming Languages: Strong proficiency in Python.
  • Experience designing and implementing scalable, efficient data pipelines using Python and relevant libraries (such as Pandas, NumPy, SQLAlchemy, etc.); a brief illustrative sketch follows this list.
  • Solid understanding of ETL principles, data modeling, and schema design.
  • Big Data Technologies: Experience with Apache Spark, Kafka, and other big data technologies.
  • Data Integration and Automation: Familiarity with data integration and automation tools (such as Airbyte, Apache Airflow, Luigi, etc.).
  • DevOps and CI/CD: Familiarity with DevOps practices and tools like Jenkins, Docker, and Kubernetes.
  • Data Governance: Knowledge of data governance frameworks and practices, including data security and privacy regulations.
  • Experience in Startups: Proven experience working with data startups as part of the product development team.
  • Analytical Skills: Strong analytical and problem-solving skills, with the ability to troubleshoot data-related issues.
  • Communication and Collaboration: Excellent communication and collaboration skills.
  • Attention to Detail: A strong focus on data quality and integrity.
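
To make the pipeline expectations concrete, here is a minimal, hypothetical sketch of the kind of Python work described above: extract with SQLAlchemy, transform with pandas, and load the result back to the warehouse. The connection string, table names, and column names are placeholders, not taken from this posting.

    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder DSN: any SQLAlchemy-supported database or warehouse works here.
    engine = create_engine("postgresql://user:pass@localhost:5432/analytics")

    # Extract: pull raw events into a DataFrame.
    raw = pd.read_sql("SELECT user_id, event_ts, amount FROM raw_events", engine)

    # Transform: aggregate to one row per user per day.
    raw["event_date"] = pd.to_datetime(raw["event_ts"]).dt.date
    daily = raw.groupby(["user_id", "event_date"], as_index=False)["amount"].sum()

    # Load: write the modeled table back for downstream consumers.
    daily.to_sql("daily_user_amounts", engine, if_exists="replace", index=False)
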
Required Skills & Mindsets:
  • Strong foundation in software engineering best practices, including design patterns and modular architecture.
  • Excellent analytical and problem-solving skills, with the ability to work independently and within a team.
  • Strong work ethic, solid time-management skills, and a professional, team-oriented attitude.
  • Commitment to high-quality software development, with a focus on testing and code quality.
  • Fast learner, enthusiastic about tackling new challenges and technologies.

Must-Have Skills:
  • DBT (data build tool)
  • Python
  • AWS / GCP
  • Snowflake
  • Databricks
  • Snowflake Marketplace
  • Databricks Marketplace

What You Will Be Doing:
  • Design, develop, and optimize data pipelines and architectures to support our composable CDP product.
  • Implement and maintain ETL processes using dbt (see the orchestration sketch after this list).
  • Work with large datasets and develop scalable data models on Snowflake.
  • Utilize Databricks for big data processing and machine learning workflows.
  • Manage and optimize cloud infrastructure on AWS and GCP to ensure high performance and availability.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions.
  • Ensure data quality, governance, and compliance across all data pipelines.
  • Implement best practices for data management, including data security and privacy.
  • Lead and mentor junior data engineers, fostering a culture of continuous learning and improvement.
  • Actively participate in the product development process as part of the core team.
  • Publish product updates on the Snowflake and Databricks marketplaces.
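
As an illustration of the orchestration work above, here is a minimal sketch of a daily Airflow DAG that runs a dbt transformation after an ingestion step. The DAG id, script name, and commands are hypothetical, assuming Airflow 2.4+ and a dbt project available on the worker.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_cdp_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule" replaces "schedule_interval" in Airflow 2.4+
        catchup=False,
    ) as dag:
        # Ingestion step, e.g. an Airbyte sync or a custom Python extract.
        extract = BashOperator(
            task_id="extract_raw_data",
            bash_command="python extract_raw_data.py",  # placeholder script
        )
        # Transformation step: run dbt models against the warehouse.
        transform = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run",
        )
        # Run the extract before the dbt transformation.
        extract >> transform
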
Engagement Type:
  • Job Type: Contract
  • Location: Remote
  • Working Time: 9:00 AM to 6:00 PM IST
  • Interview Process: 2 rounds (1. Coding Test, 2. Technical Interview)

How to Apply for This Opportunity:
  • Register or log in on our portal.
  • Click Apply, upload your resume, and fill in the required details.
  • After this, click Apply Now to submit your application.
  • Get matched and crack a quick interview with our hiring partner.
  • Land your global dream job and get your exciting career started!

About Uplers:

Our goal is to make hiring reliable, simple, and fast. Our role is to help our talents find and apply for relevant opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement, and you will be assigned a dedicated Talent Success Coach for its duration.

(Note: There are many more opportunities on the portal apart from this one. Depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Job Type: Full-time

Pay: 247,00 - 309,500.00 per month

Benefits:
  • Work from home

Work Location: Remote