Software Engineer/Data Engineer Intern

TopCrop

Internship · In-office · 1 Opening · Starting on 18th September

Job Description

We are seeking a proactive and driven Software Engineer/Data Engineer Intern to join our team. In this role, you will have the exciting opportunity to work on a critical project: migrating our technology stack from AWS, MongoDB, and Spark to Snowflake and dbt.

1. Migration Planning: Collaborate with the engineering team to plan and strategize the migration process, ensuring minimal disruption to ongoing operations.
2. ETL Development: Develop Extract, Transform, Load (ETL) processes using dbt to enable efficient data processing and analytics (a brief illustrative sketch follows this list).
3. Testing and Quality Assurance: Conduct thorough testing and quality assurance of the migrated stack to identify and address any issues.
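
For context on what such an ETL step might look like in practice, below is a minimal sketch of extracting documents from MongoDB and loading them into a Snowflake staging table, on top of which dbt models would then handle the transformations. It assumes the pymongo and snowflake-connector-python packages; all connection details, database names, and table names are hypothetical and not taken from this posting.

    # Illustrative extract-and-load sketch; every name and credential below is hypothetical.
    from pymongo import MongoClient
    import snowflake.connector

    # Extract: read raw documents from a MongoDB collection.
    mongo = MongoClient("mongodb://localhost:27017")
    docs = mongo["shop"]["orders"].find({}, {"_id": 1, "customer_id": 1, "total": 1})

    # Load: copy the rows into a Snowflake staging table that dbt models would build on.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="RAW",
        schema="MONGO",
    )
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS ORDERS_STAGE (ID STRING, CUSTOMER_ID STRING, TOTAL FLOAT)"
    )
    cur.executemany(
        "INSERT INTO ORDERS_STAGE (ID, CUSTOMER_ID, TOTAL) VALUES (%s, %s, %s)",
        [(str(d["_id"]), str(d.get("customer_id", "")), float(d.get("total", 0))) for d in docs],
    )
    conn.close()
    mongo.close()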

Candidate Preferences

- Currently pursuing a degree in Computer Science, Data Engineering, or a related field.
- Proficiency in programming languages such as Python, Java, or Scala.
- Knowledge of data technologies such as MongoDB, Snowflake, and Spark is a plus.
- Willingness to learn and adapt to new technologies.

What We Offer

- Exposure to cutting-edge technologies in the data and analytics space.
- Opportunity to work closely with experienced data engineers and software developers.

Job Overview

Date Posted: 20 days ago

Location: Bhubaneswar

Stipend: INR 10,000/month

Applicants: 247

Job Category

Technology

Job Skills

Python

Snowflake

AWS
