Thingtrax is building an Agentic Manufacturing Operations platform to fundamentally change how factories run. By deploying AI-powered agents, including computer vision systems, directly on production lines, we enable manufacturing operations that can observe, reason, and act in real time. Partnering closely with manufacturers, especially in food and beverage, we're helping teams reduce waste, improve quality, and move from manual intervention to autonomous, continuously optimised operations.
We are establishing a global R&D centre with a state-of-the-art applied AI laboratory equipped with NVIDIA Jetson/RTX platforms, industrial global-shutter cameras, precision optics, high-power strobed lighting systems, and real production hardware environments.
This role is deeply hands-on and focused on low-level camera integration, real-time image acquisition, and hardware-software synchronization for edge AI systems.
Role Overview
We are looking for an engineer who understands how cameras actually work at the electrical, driver, and kernel level, not just through OpenCV.
You will own the integration of industrial cameras (MIPI CSI-2, GigE Vision, USB3 Vision), sensor bring-up, strobe synchronization, trigger logic, and real-time data pipelines on NVIDIA edge platforms.
This is not general PCB design; it is low-level vision system engineering.
Key Responsibilities
Camera & Interface Integration
- Integrate and optimize MIPI CSI-2, USB3 Vision, and GigE Vision cameras
- Bring up image sensors on NVIDIA Jetson platforms
- Work with V4L2, media controller framework, and device tree configurations
- Develop and modify Linux kernel drivers for camera subsystems
- Debug CSI lane configuration, I2C sensor control, and bandwidth bottlenecks
- Implement hardware trigger and strobe synchronization (external trigger, GPIO, PWM, sync generators)
Real-Time Vision Hardware Optimization
- Optimize image acquisition pipelines for low latency and high frame rates (120-200 FPS)
- Handle DMA, buffer management, and zero-copy transfers to GPU
- Ensure deterministic timing for strobed lighting and exposure control
- Profile and reduce end-to-end capture-to-inference latency
System-Level Engineering
- Integrate cameras, lenses, IR lighting, and processing units into production-ready systems
- Work with signal integrity and high-speed interface considerations (CSI, USB, Ethernet)
- Conduct performance validation under production conditions
- Diagnose hardware/firmware issues in live industrial environments
Collaboration & Productionisation
- Work closely with CUDA / AI engineers to optimize GPU pipelines
- Support deployment on edge devices (Jetson Orin, Nano, etc.)
- Coordinate with suppliers on camera modules, sensors, and sync hardware
- Document hardware-software integration architecture
Requirements
- Strong experience with Linux kernel development
- Deep understanding of V4L2 and camera driver stack
- Experience working with MIPI CSI-2 camera bring-up
- Knowledge of I2C sensor configuration and register-level debugging
- Experience with hardware trigger and strobe synchronization
- Familiarity with GigE Vision and USB3 Vision protocols
- Experience with NVIDIA Jetson platforms
- Understanding of image sensor fundamentals (global shutter, rolling shutter, exposure timing)
- Strong debugging skills using oscilloscopes, logic analyzers, and protocol analyzers
Bonus (Highly Valuable)
- Experience with FPGA-based image pipelines
- CUDA memory optimization knowledge
- Experience with industrial strobed IR lighting systems
- Knowledge of real-time systems and deterministic timing design
- Experience in manufacturing or machine vision environments
Benefits
- Private Health Insurance
- Paid Time Off
- Work From Home
- Training & Development