
CG-VAK Software & Exports Ltd.

Lead Engineer, Data & Integration

  • Posted 10 days ago

Job Description

Role Overview

We are seeking an experienced Data Engineer with strong expertise in cloud data warehousing, iPaaS integrations, and modern ETL/ELT pipeline development. The ideal candidate will design scalable data platforms, build high-quality data pipelines, and ensure data governance, quality, and security across the organization.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines for data ingestion, transformation, and integration across multiple systems.
  • Work with Snowflake or similar cloud data warehouse platforms to design schemas, optimize performance, and manage data workflows.
  • Utilize SnapLogic or similar iPaaS tools to integrate applications, APIs, and data sources.
  • Build and orchestrate pipelines using Snowpipe, Streams, Tasks, and other Snowflake automation features.
  • Develop high-performance SQL using Snowflake SQL for transformations, modeling, and validation.
  • Implement data modeling, schema design, metadata management, and scalable warehouse structures.
  • Integrate with REST APIs, handle JSON/XML processing, and manage API-based ingestion.
  • Ensure end-to-end source-to-target mapping (STTM), data quality checks, validation, and data governance compliance.
  • Apply data security, access control, and encryption standards across data systems.
  • Build workflow orchestration and automation for pipelines using cloud data tools.
  • Leverage Python for data transformation, automation, and integration tasks.
  • Manage DevOps for data, including CI/CD pipelines, version control, and environment management.
  • Implement logging, monitoring, error handling, and recovery strategies for pipelines and integrations.
  • Collaborate with cross-functional teams to support analytics, reporting, and business use cases.
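The ingestion and data-quality responsibilities above can be illustrated with a minimal sketch. This is not the company's actual codebase; the field names and mapping are assumptions chosen for illustration of a simple source-to-target mapping (STTM) with validation:

```python
import json

def transform_orders(raw_json: str) -> list[dict]:
    """Parse a REST API payload (hypothetical schema) and apply a
    simple source-to-target mapping with basic data-quality checks."""
    records = json.loads(raw_json)
    out = []
    for r in records:
        # Data-quality check: skip records missing required fields.
        if not r.get("id") or r.get("amount") is None:
            continue
        out.append({
            "order_id": int(r["id"]),                      # cast to target type
            "amount_usd": round(float(r["amount"]), 2),    # normalize precision
            "status": r.get("status", "unknown").lower(),  # default + normalize case
        })
    return out

# Example payload: one valid record, one that fails validation.
payload = '[{"id": "1", "amount": "19.991", "status": "SHIPPED"}, {"id": null, "amount": 5}]'
print(transform_orders(payload))  # → [{'order_id': 1, 'amount_usd': 19.99, 'status': 'shipped'}]
```

In a production pipeline the same mapping and checks would typically live in Snowflake SQL or an iPaaS transformation step rather than ad-hoc Python, but the structure (map, cast, validate, drop or quarantine bad rows) is the same.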

Required Skills & Experience

  • Strong experience in Snowflake or a similar cloud data platform (BigQuery, Redshift, Databricks).
  • Hands-on experience with iPaaS solutions such as SnapLogic, Boomi, Mulesoft, or Informatica Cloud.
  • Expertise in ETL/ELT pipelines, data integration, and cloud-native pipeline orchestration.
  • Advanced proficiency in SQL and Snowflake SQL.
  • Experience with Snowpipe, Streams, Tasks, and other Snowflake automation components.
  • Strong knowledge of data modeling, schema design, metadata management, and data lifecycle management.
  • Experience with API integrations, REST APIs, JSON/XML, and automation frameworks.
  • Proficiency in Python for scripting and data workflows.
  • Understanding of data governance, data quality, data validation, and security standards.
  • Experience with DevOps for data, including CI/CD, code repositories, testing automation, and workflow automation.
  • Strong debugging, performance tuning, and error-handling skills.
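The error-handling and recovery skills listed above often come down to patterns like the one sketched below: retry a transient pipeline step with logging before failing hard. The step name and retry policy here are illustrative assumptions, not part of the role's actual stack:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(step, attempts: int = 3, delay: float = 0.0):
    """Run a pipeline step, logging each failure and retrying up to
    `attempts` times before re-raising the last error."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(delay)

# Hypothetical flaky load step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return "loaded"

print(run_with_retries(flaky_load))  # → loaded
```

Orchestrators such as Airflow or Snowflake Tasks provide this kind of retry and alerting natively; the point of the sketch is the discipline (log, bound the retries, surface the final failure), not the specific helper.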

Preferred Qualifications

  • Experience with cloud platforms (AWS, Azure, or GCP).
  • Familiarity with orchestration tools such as Airflow, Prefect, Dagster, or Control-M.
  • Knowledge of integration patterns and microservices-based architecture.

Skills: data governance, data integration, management, SQL, automation, Snowflake

More Info

Job ID: 135074595
