
ValueLabs

SnapLogic Developer

4-6 Years

Job Description

Note: Looking for Immediate Joiners

Job Title: Data Engineering Specialist (Streaming & Integration)

Department: Data & Analytics

Reports To: Data Engineering Manager

Location: Remote (Global)

Employment Type: Full-Time

Overview:

We are seeking a highly skilled Data Engineering Specialist with deep expertise in real-time data streaming and integration platforms to join our growing data team. The ideal candidate will have hands-on experience with SnapLogic and Confluent Kafka, and will be responsible for designing, building, and maintaining robust, scalable data pipelines that enable real-time analytics, operational intelligence, and seamless integration across enterprise systems.

Key Responsibilities:

  • Design, develop, and maintain high-throughput, low-latency data pipelines using SnapLogic and Confluent Kafka.
  • Architect and implement event-driven systems using Kafka for real-time data ingestion, processing, and distribution across microservices and downstream analytics platforms.
  • Configure and manage SnapLogic integration workflows for secure, reliable, and automated data movement between SaaS, on-premise, and cloud applications.
  • Collaborate with data scientists, analysts, and application teams to understand data needs and deliver scalable integration solutions.
  • Optimize Kafka cluster performance, monitor stream health, and ensure data durability, consistency, and fault tolerance.
  • Implement data quality checks, schema evolution strategies, and observability using tools like Confluent Control Center, Grafana, and Prometheus.
  • Ensure security and compliance in data flows through encryption, access control, and audit logging.
  • Participate in agile ceremonies and contribute to technical documentation, release planning, and CI/CD practices.
  • Stay current with evolving trends in streaming data, integration platforms, and cloud-native data architectures.
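For candidates gauging fit, the keyed, event-driven processing described above can be sketched in plain Python, with no broker required. This is an illustrative stand-in for logic that would normally run in Kafka Streams or KSQL; the `Event` shape and window size are hypothetical, not part of this role's actual stack.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    key: str      # e.g. a device or order ID; Kafka routes records by key
    value: float  # the measurement carried by the event

def process_stream(events, window_size=3):
    """Keyed rolling average: for each key, average the last
    `window_size` values seen so far, mimicking a simple
    stateful stream-processing step."""
    windows = defaultdict(list)
    output = []
    for ev in events:
        win = windows[ev.key]
        win.append(ev.value)
        if len(win) > window_size:
            win.pop(0)  # evict the oldest value once the window is full
        output.append((ev.key, sum(win) / len(win)))
    return output
```

In a production pipeline the per-key state above would live in a fault-tolerant state store rather than an in-memory dict, which is exactly the durability and fault-tolerance concern the responsibilities call out.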

Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 4+ years of professional experience in data engineering with a focus on streaming data and integration.
  • Proven experience with Confluent Kafka: building producers/consumers, managing topics, handling partitioning, replication, and stream processing using Kafka Streams or KSQL.
  • Extensive hands-on experience with SnapLogic, including building, testing, and deploying integrations using the SnapLogic Integration Cloud.
  • Strong understanding of data modeling, ETL/ELT processes, and data pipeline orchestration.
  • Experience with cloud platforms (AWS, Azure, or GCP) and containerized environments (Docker, Kubernetes).
  • Proficiency in scripting languages (Python, Bash) and familiarity with infrastructure as code (Terraform, CloudFormation).
  • Knowledge of data security, governance, and compliance standards (e.g., GDPR, SOC 2).
  • Excellent communication skills and ability to work in a collaborative, remote-first environment.
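As a point of reference for the partitioning experience listed above, the idea that all records sharing a key land on the same partition (preserving per-key ordering) can be sketched as follows. This is a simplified illustration only: Kafka's actual default partitioner hashes keys with murmur2, while CRC32 is used here purely for a self-contained example.

```python
import zlib

def assign_partition(key: str, num_partitions: int) -> int:
    """Hash the record key and take it modulo the partition count,
    so every record with the same key maps to the same partition.
    Kafka's real default partitioner uses murmur2, not CRC32."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions
```

Because the mapping is deterministic, consumers of a given partition see all events for a given key in order, which is why key choice matters for both load balance and ordering guarantees.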



Job ID: 136393325