Data Engineer


Job Description

Project Role : Data Engineer

Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
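The ETL (extract, transform and load) workflow described above can be sketched in a few lines. This is a minimal stdlib-only illustration of the extract → transform (with a data-quality step) → load pattern, not the role's actual stack (which centers on PySpark and AWS Glue); the table and field names are hypothetical.

```python
import sqlite3

# Minimal ETL sketch: extract raw records, transform (validate + cast), load into a table.
# "orders", "id", and "amount" are illustrative names, not from the posting.

def extract():
    # In practice this step would read from S3, a database, or an API.
    return [
        {"id": "1", "amount": "10.5"},
        {"id": "2", "amount": ""},      # bad record: missing amount
        {"id": "3", "amount": "7.25"},
    ]

def transform(rows):
    # Data-quality step: drop records that fail validation, then cast types.
    clean = []
    for r in rows:
        if r["amount"]:
            clean.append((int(r["id"]), float(r["amount"])))
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 17.75)
```

The same three-stage shape carries over directly to a PySpark or Glue job, where `transform` becomes DataFrame operations and `load` writes to S3 or a warehouse.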

Must have skills : AWS BigData

Good to have skills : NA

Minimum 3 year(s) of experience is required

Educational Qualification : 15 years of full-time education

Summary:

As a Data Engineer, you will develop custom software solutions to design, code, and enhance components across systems and applications. Your typical day will involve collaborating with cross-functional teams to understand business requirements and applying modern frameworks and agile practices to deliver scalable, high-performing solutions tailored to specific business needs. You will engage in problem-solving activities, ensuring that the software solutions meet high standards of quality and performance while adapting to evolving project requirements.

Roles & Responsibilities:

  • Strong experience in data engineering, ETL processes, and data quality processes.
  • Hands-on experience with Python/PySpark and SQL.
  • Hands-on experience with AWS services (EMR, EC2, S3, Glue, etc.) for ETL processes.
  • Expected to perform independently and become an SME.
  • Active participation and contribution in team discussions is required.
  • Contribute to providing solutions for work-related problems.
  • Collaborate with stakeholders to gather and analyze requirements for software development.
  • Implement best practices in software development to ensure high-quality deliverables.
  • Shift timing is 1:00 PM to 10:30 PM.
  • Candidate should have good communication skills.

Professional & Technical Skills:

  • Must Have Skills: Proficiency in AWS BigData.
  • Strong understanding of cloud computing concepts and services.
  • Experience with data processing frameworks such as Apache Spark or Hadoop.
  • Familiarity with database technologies, including SQL and NoSQL databases.
  • Knowledge of data warehousing solutions and ETL processes.
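As a concrete illustration of the SQL and data-quality skills listed above, the sketch below runs two common checks (null keys and duplicate IDs) against a toy table. The `events` table and its columns are hypothetical, assumed only for this example; in practice such checks would run against warehouse tables via Spark SQL or Glue.

```python
import sqlite3

# Hypothetical data-quality checks in SQL: null foreign keys and duplicate primary keys.
# Table and column names are illustrative, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (event_id INTEGER, user_id INTEGER, ts TEXT);
    INSERT INTO events VALUES (1, 100, '2024-01-01'),
                              (2, NULL, '2024-01-02'),
                              (2, 101, '2024-01-03');
""")

# Check 1: rows with a missing user_id.
null_users = conn.execute(
    "SELECT COUNT(*) FROM events WHERE user_id IS NULL").fetchone()[0]

# Check 2: event_id values that appear more than once.
dup_ids = conn.execute(
    "SELECT COUNT(*) FROM (SELECT event_id FROM events "
    "GROUP BY event_id HAVING COUNT(*) > 1)").fetchone()[0]

print(null_users, dup_ids)  # 1 1
```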

Additional Information:

  • The candidate should have a minimum of 3 years of experience in AWS BigData.
  • This position is based at our Bengaluru office.
  • 15 years of full-time education is required.





Job ID: 147218327
