Job Description

Position Requirements:

4+ years of experience using data integration tools such as Pentaho or other ETL/ELT tools.

4+ years of experience using traditional databases such as Postgres, MSSQL, or Oracle.

1+ years of experience using columnar databases such as Vertica, Google BigQuery, or Amazon Redshift.

1+ years of experience with scheduler/orchestration tools such as Control-M, Autosys, Airflow, or JAMS.

Good conceptual knowledge of ETL/ELT strategies.

Good conceptual knowledge of any code versioning tool.

Good collaboration, communication and documentation skills.

Experience working in an Agile delivery model.

Ability to work with minimal or no direct supervision.

Good knowledge of data visualization tools such as Tableau and Pentaho BA.

Digital marketing/web analytics or business intelligence experience is a plus.

Knowledge of scripting languages such as Python.

Experience in a Linux environment is preferred but not mandatory.

Roles & Responsibilities:

Develop and support multiple data engineering projects with heterogeneous data sources; produce/consume data to/from messaging queues like Kafka; and push/pull data to/from REST APIs.

Support the in-house-built Data Integration Framework, Data Replication Framework, and Data Profiling & Reconciliation Framework.

Develop data pipelines with good coding standards and unit testing with detailed test cases.

Willingness to learn new technologies.

Qualification: B.E. in Computer Science/IT (or any other engineering discipline).

Work Timings: 2 PM to 11 PM IST

Job ID: 139205007
