Job Description
Skills:
- Technology -> BPM - Opensource -> Talend Integration Suite -> Talend Integration Studio
- Technology -> Cloud Platform -> Amazon Web Services DevOps
- Technology -> Data on Cloud - DataStore -> Snowflake
Key Responsibilities:

Data Integration & Pipelines
- Design, develop, and maintain Talend ETL/ELT jobs to ingest and transform data from multiple sources into Snowflake.
- Implement incremental loads, CDC patterns, and scheduling/orchestration to ensure timely and reliable data delivery.

Snowflake Development
- Develop and optimize Snowflake objects (schemas, tables, views) and write efficient SQL for transformations and reporting needs.
- Monitor and tune performance using clustering, warehouse sizing, and query optimization best practices.

AWS Enablement
- Integrate AWS services (e.g., S3 for landing zones) with Talend and Snowflake for secure data movement and storage.
- Support environment configuration, access controls, and operational monitoring aligned with cloud best practices.

Quality, Operations & Collaboration
- Build data quality checks, validations, and reconciliation processes to ensure accuracy and completeness.
- Troubleshoot pipeline failures, perform root-cause analysis, and implement preventive fixes and documentation.
- Collaborate in a hybrid setup with cross-functional teams to gather requirements, estimate work, and deliver iteratively.
Required Experience:
- Designing ELT patterns in Snowflake and optimizing compute/storage usage for cost and performance.
- Hands-on experience with AWS data ecosystem integrations and secure data handling practices.
- Familiarity with CI/CD practices for data pipelines and maintaining reusable Talend components/frameworks.
- Experience working with structured and semi-structured data and implementing robust data validation strategies.
- Proven ability to collaborate effectively in a hybrid model, communicate clearly, and deliver within sprint-based execution.

Good-to-have skills: dbt, Apache Airflow, AWS Glue, Python, Terraform