
Talend, Snowflake, AWS

  • Posted 15 hours ago

Job Description

  • Technology -> BPMI - Opensource -> Talend Integration Suite -> Talend Integration Studio
  • Technology -> Cloud Platform -> Amazon Web Services DevOps
  • Technology -> Data on Cloud - DataStore -> Snowflake

Key Responsibilities:

Data Integration & Pipelines
  • Design, develop, and maintain Talend ETL/ELT jobs to ingest and transform data from multiple sources into Snowflake.
  • Implement incremental loads, CDC patterns, and scheduling/orchestration to ensure timely and reliable data delivery.

Snowflake Development
  • Develop and optimize Snowflake objects (schemas, tables, views) and write efficient SQL for transformations and reporting needs.
  • Monitor and tune performance using clustering, warehouse sizing, and query optimization best practices.

AWS Enablement
  • Integrate AWS services (e.g., S3 for landing zones) with Talend and Snowflake for secure data movement and storage.
  • Support environment configuration, access controls, and operational monitoring aligned with cloud best practices.

Quality, Operations & Collaboration
  • Build data quality checks, validations, and reconciliation processes to ensure accuracy and completeness.
  • Troubleshoot pipeline failures, perform root-cause analysis, and implement preventive fixes and documentation.
  • Collaborate in a hybrid setup with cross-functional teams to gather requirements, estimate work, and deliver iteratively.
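The incremental-load responsibility above can be illustrated with a minimal sketch. This assumes a `last_modified` timestamp column serves as the watermark; the function and field names are illustrative, not taken from the posting or any Talend API.

```python
from datetime import datetime

def incremental_extract(rows, last_watermark):
    """Return only rows changed since the previous run's watermark,
    plus the new watermark to persist for the next run.

    Illustrative watermark-based incremental load; in practice Talend
    would fetch and persist the watermark via job context variables."""
    new_rows = [r for r in rows if r["last_modified"] > last_watermark]
    new_watermark = max(
        (r["last_modified"] for r in new_rows), default=last_watermark
    )
    return new_rows, new_watermark

# Example: two of three source rows are newer than the stored watermark.
rows = [
    {"id": 1, "last_modified": datetime(2024, 1, 1)},
    {"id": 2, "last_modified": datetime(2024, 1, 3)},
    {"id": 3, "last_modified": datetime(2024, 1, 5)},
]
changed, wm = incremental_extract(rows, datetime(2024, 1, 2))
```

Only the changed rows would then be staged (e.g., to S3) and merged into Snowflake, keeping each run's data volume proportional to the delta rather than the full table.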

  • Experience designing ELT patterns in Snowflake and optimizing compute/storage usage for cost and performance.
  • Hands-on experience with AWS data ecosystem integrations and secure data handling practices.
  • Familiarity with CI/CD practices for data pipelines and maintaining reusable Talend components/frameworks.
  • Experience working with structured and semi-structured data and implementing robust data validation strategies.
  • Proven ability to collaborate effectively in a hybrid model, communicate clearly, and deliver within sprint-based execution.

Good to have skills: dbt, Apache Airflow, AWS Glue, Python, Terraform
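The validation and reconciliation work described above typically reduces to comparing a source extract against the loaded target: row counts, a column checksum, and any keys that went missing. A minimal Python sketch, with all names hypothetical:

```python
def reconcile(source_rows, target_rows, key="id", measure="amount"):
    """Compare a source extract against the loaded target table.

    Illustrative reconciliation: row count, sum-of-measure checksum,
    and keys present in the source but absent from the target."""
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "checksum_match": (
            sum(r[measure] for r in source_rows)
            == sum(r[measure] for r in target_rows)
        ),
        "missing_keys": sorted(
            {r[key] for r in source_rows} - {r[key] for r in target_rows}
        ),
    }
    checks["passed"] = (
        checks["row_count_match"]
        and checks["checksum_match"]
        and not checks["missing_keys"]
    )
    return checks

src = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
tgt = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
result = reconcile(src, tgt)
```

In a real pipeline these checks would run as a post-load step, with failures blocking downstream consumption and feeding the root-cause analysis mentioned above.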


Job ID: 147209095
