
  • Posted 8 hours ago
  • Over 50 applicants

Job Description

Vayuz Technologies is seeking a highly skilled Data Engineer to design and develop robust data pipelines for real-time and batch data ingestion and processing. This role is crucial for leveraging technologies like Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink to build scalable and reliable data solutions. You will play a key role in ensuring data accuracy, optimizing pipeline performance, and collaborating with cross-functional teams to drive innovation.

Role Expectations

  • Data Pipeline Development: Design and develop sophisticated data pipelines for real-time and batch data ingestion and processing using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink.
  • Kafka Connector Expertise: Build and configure Kafka Connectors to ingest data from various sources (databases, APIs, message queues, etc.) into Kafka.
  • Flink Application Development: Develop Flink applications for complex event processing, stream enrichment, and real-time analytics.
  • ksqlDB Optimization: Develop and optimize ksqlDB queries for real-time data transformations, aggregations, and filtering.
  • Data Quality & Monitoring: Implement robust data quality checks and monitoring to ensure data accuracy and reliability throughout the pipeline.
  • Performance Tuning: Monitor and troubleshoot data pipeline performance, identify bottlenecks, and implement effective optimizations.
  • Automation: Automate data pipeline deployment, monitoring, and maintenance tasks to enhance efficiency.
  • Continuous Learning: Stay up-to-date with the latest advancements in data streaming technologies and best practices.
  • Standard Contributions: Contribute to the development of data engineering standards and best practices within the organization.
  • Team Collaboration: Participate actively in code reviews and contribute to a collaborative and supportive team environment. Work closely with other architects and tech leads in India & US to create POCs and MVPs.
  • Reporting: Provide regular updates on tasks, status, and risks to the project manager.
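As context for the data-quality expectation above, here is a minimal sketch of the kind of per-record validation a streaming pipeline might run before publishing to Kafka. The field names and rules are illustrative assumptions, not requirements from this posting:

```python
# Illustrative per-record data-quality check for a streaming pipeline.
# REQUIRED_FIELDS and the timestamp rules are assumptions for the sketch.
from datetime import datetime, timezone

REQUIRED_FIELDS = {"event_id", "source", "timestamp"}

def validate_record(record: dict) -> list:
    """Return a list of data-quality violations (empty list = record is clean)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
    ts = record.get("timestamp")
    if ts is not None:
        try:
            parsed = datetime.fromisoformat(ts)
        except (TypeError, ValueError):
            errors.append("timestamp is not ISO-8601")
        else:
            if parsed.tzinfo is None:
                parsed = parsed.replace(tzinfo=timezone.utc)
            if parsed > datetime.now(timezone.utc):
                errors.append("timestamp is in the future")
    return errors
```

In practice a check like this would feed a dead-letter topic and monitoring metrics rather than silently dropping bad records.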

Qualifications

  • Bachelor's degree or higher from a reputed university.
  • Extensive overall experience, with a significant focus on ETL/ELT, big data, and Kafka.
  • Proficiency in developing Flink applications for stream processing and real-time analytics.
  • Strong understanding of data streaming concepts and architectures.
  • Extensive experience with Confluent Kafka, including Kafka Brokers, Producers, Consumers, and Schema Registry.
  • Hands-on experience with ksqlDB for real-time data transformations and stream processing.
  • Experience with Kafka Connect and building custom connectors.
  • Extensive experience in implementing large-scale data ingestion and curation solutions.
  • Strong hands-on experience with the big data technology stack on any major cloud platform.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work independently and as part of a team.
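To illustrate the stream-processing concepts listed above, the core idea behind a Flink or ksqlDB tumbling-window aggregation can be sketched in plain Python. This is a toy model of the windowing semantics only, an assumption for illustration, not how either system is actually invoked:

```python
# Toy model of a tumbling (fixed, non-overlapping) window aggregation,
# mimicking the semantics of a Flink/ksqlDB windowed count.
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per key within fixed windows of width window_ms.

    events: iterable of (timestamp_ms, key) pairs.
    Returns {window_start_ms: {key: count}} sorted by window start.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # snap to window boundary
        counts[window_start][key] += 1
    return {w: dict(keys) for w, keys in sorted(counts.items())}

events = [(1000, "click"), (1500, "click"), (2500, "view"), (3100, "click")]
result = tumbling_window_counts(events, 1000)
# -> {1000: {"click": 2}, 2000: {"view": 1}, 3000: {"click": 1}}
```

Real stream processors add what this sketch omits: event-time watermarks, late-data handling, and incremental state that never materializes the full input.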

More Info

Open to candidates from: India

About Company

We are an Engineering company with leading capabilities in Digital Transformation, Internet-related services and products, Data Science, Development Operations, Product as a Service, Technology Consulting, and Software Engineering. We are trusted by NSE giants as a digital transformation partner and have a proven track record in providing customized digital solutions across industry segments. To know more, visit us at www.vayuz.com

VAYUZ DNA: We believe that great products require a product-thinking DNA. Our approach starts with putting the customer first and runs through everything we do, from the development process to the product design methodology, every line of code we write, and the way our team collaborates with clients. This strong foundation allows us to deliver innovative, scalable, and robust technology solutions.

Key Mantras:

  • Respect TIME: We value time and are committed to delivering on our promises.
  • PLAN Well: Detailed planning is the cornerstone of our high-quality delivery.
  • Deliver QUALITY: Excellence in every deliverable is non-negotiable.

Core Offerings:

  • Cloud: Migration, infrastructure management, SaaS, PaaS, IaaS.
  • Data: Analytics, BI tools, predictive analytics, data-driven decisions.
  • AI/ML: Automation, models, NLP.
  • Digital Workplace: Remote work, virtual desktops, employee experience.
  • IoT: Connected devices, analytics, smart infrastructure.
  • Mobile/Web: Custom apps, PWAs, mobile-first design.
  • Legacy Modernization: Upgrades, refactoring, integration.
  • DevOps/Agile: Implementation, methodology for faster delivery.

Job ID: 122413967
