
Angel and Genie

ETL Developer

  • Posted a day ago

Job Description

Job Title: Talend ETL Developer (5 - 6 Years Experience)

  1. Immediate joiner required
  2. Three levels of interview (Level 3 in person mandatory)
  3. 5 to 6 years of experience
  4. Budget: 15 LPA
  5. Hybrid mode: Monday to Thursday on-site (mandatory)
  6. Location: Hyderabad

Job Summary

We are looking for a skilled Talend ETL Developer with 5-6 years of experience in designing, developing, and maintaining data integration solutions. The ideal candidate should have strong expertise in Talend, data warehousing concepts, and ETL processes, with the ability to work in a fast-paced environment and collaborate with cross-functional teams.

Key Responsibilities

• Design, develop, and deploy ETL workflows using Talend Open Studio / Talend Data Integration

• Build and optimize data pipelines for large-scale data processing

• Extract, transform, and load data from multiple sources (databases, APIs, flat files, cloud sources)

• Ensure data quality, integrity, and consistency across systems

• Perform data mapping, transformation, and validation

• Troubleshoot ETL job failures and performance issues

• Collaborate with Data Analysts, Data Engineers, and Business teams to understand requirements

• Implement error handling, logging, and monitoring mechanisms

• Optimize SQL queries and ETL jobs for performance tuning

• Maintain proper documentation of ETL processes and workflows

• Support production deployments and provide L2/L3 support when required
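To give candidates a feel for the extract-transform-load and error-handling duties listed above, here is a minimal sketch in plain Python (not Talend; the table, column, and file contents are hypothetical stand-ins, and a real pipeline would read from the databases, APIs, and flat files the role describes):

```python
import csv
import io
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_sketch")

# Hypothetical flat-file extract standing in for a real source system.
SOURCE_CSV = """id,name,amount
1,alice,10.50
2,bob,not-a-number
3,carol,7.25
"""

def extract(csv_text):
    """Extract: parse rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast and validate; log and skip bad rows (error handling)."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["id"]), row["name"], float(row["amount"])))
        except ValueError:
            log.warning("skipping bad row: %r", row)
    return clean

def load(rows, conn):
    """Load: write validated rows into the target table in one transaction."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (2, 17.75) -- the malformed row was logged and skipped
```

In a Talend job the same three stages would typically map to input, tMap/validation, and output components, with rejected rows routed to a reject flow instead of a log call.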

Required Skills & Qualifications

• 5 - 6 years of experience in ETL development

• Strong hands-on experience with Talend (DI / Big Data / Cloud)

• Proficiency in SQL (advanced queries, joins, performance tuning)

• Experience with RDBMS (Oracle, SQL Server, MySQL, PostgreSQL)

• Good understanding of data warehousing concepts (star schema, snowflake schema)

• Experience in handling large datasets and performance optimization

• Knowledge of Unix/Linux scripting

• Familiarity with version control systems (Git, SVN)

• Strong debugging and problem-solving skills

• Good communication and teamwork abilities

• Knowledge of CI/CD pipelines

• Exposure to data governance and data quality tools

Job ID: 147167817
