Research & Papers

FLARE-BO: Fused Luminance and Adaptive Retinex Enhancement via Bayesian Optimisation for Low-Light Robotic Vision

Training-free AI enhances robotic sight in darkness with 8 optimized parameters

Deep Dive

Researchers from the University of Manchester (Nathan Shankar, Pawel Ladosz, Hujun Yin) have released FLARE-BO, a training-free framework that uses Bayesian optimization with Gaussian processes to enhance low-light robotic vision. Unlike previous methods limited to 3 parameters, FLARE-BO jointly optimizes 8 parameters spanning gamma correction, LIME-style illumination normalization, chrominance denoising, bilateral filtering, non-local means (NLM) denoising, Grey-World automatic white balance, and adaptive post-smoothing. This approach avoids the need for large training datasets or learned models, making it adaptable to diverse environments.
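To make the pipeline concrete, here is a minimal NumPy sketch of two of the stages named above, gamma correction and Grey-World white balance. The function names, the toy frame, and the parameter value are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gamma_correct(img, gamma):
    """Brighten a dark image in [0, 1]: gamma < 1 lifts shadows."""
    return np.clip(img, 0.0, 1.0) ** gamma

def grey_world_awb(img):
    """Grey-World automatic white balance: scale each channel so its
    mean matches the global mean intensity, removing color casts."""
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / np.maximum(channel_means, 1e-8)
    return np.clip(img * gains, 0.0, 1.0)

# Toy low-light frame: dark pixels with an exaggerated blue cast
# (stand-in for a real camera image; values are illustrative).
rng = np.random.default_rng(0)
frame = rng.uniform(0.0, 0.2, size=(64, 64, 3))
frame[..., 2] *= 1.5  # bluish cast

enhanced = grey_world_awb(gamma_correct(frame, gamma=0.4))
```

In the actual framework the gamma exponent would be one of the 8 jointly optimized parameters rather than a hand-picked constant.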

FLARE-BO employs unit-hypercube parameter normalization, objective standardization, Sobol quasi-random initialization, and a Log Expected Improvement acquisition function for principled exploration. Benchmarked on the LOL (LOw-Light) paired dataset, it showed marked improvements over existing methods that were not specifically trained on that dataset. The framework addresses key limitations of prior work, such as over-smoothing of edges and the lack of white balance correction, enabling robots to navigate, inspect, and operate reliably in low-light conditions without pre-training.
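The optimization loop described above can be sketched as a generic Bayesian-optimization skeleton. This is not the authors' implementation: the objective below is a toy stand-in for the paper's image-quality score, scikit-learn's GP substitutes for whatever GP machinery the authors use, and the log-EI is a simple floored version for numerical stability:

```python
import numpy as np
from scipy.stats import norm, qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    """Toy stand-in for the image-quality objective, defined on the
    unit hypercube [0, 1]^d of normalized enhancement parameters."""
    return -np.sum((x - 0.3) ** 2)

d, n_init, n_iter = 8, 16, 20  # 8 parameters, as in FLARE-BO

# Sobol quasi-random initialization in the unit hypercube.
sobol = qmc.Sobol(d=d, scramble=True, seed=0)
X = sobol.random(n_init)
y = np.array([objective(x) for x in X])

# normalize_y standardizes the objective values inside the GP.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

def log_expected_improvement(mu, sigma, best):
    """Log of the EI acquisition (maximization), floored to avoid log(0)."""
    z = (mu - best) / np.maximum(sigma, 1e-12)
    ei = sigma * (z * norm.cdf(z) + norm.pdf(z))
    return np.log(np.maximum(ei, 1e-300))

for _ in range(n_iter):
    gp.fit(X, y)
    cand = sobol.random(256)  # candidate pool from the same sequence
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(log_expected_improvement(mu, sigma, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best parameters:", np.round(X[np.argmax(y)], 3))
```

Maximizing log-EI rather than EI itself keeps the acquisition well-behaved when predicted improvements become vanishingly small late in the search, which is the motivation the Log Expected Improvement literature gives for the transform.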

Key Points
  • FLARE-BO optimizes 8 parameters (gamma, illumination, denoising, white balance) vs. 3 in prior methods
  • Training-free approach uses Bayesian optimization with Gaussian Processes, Sobol initialization, and Log Expected Improvement
  • Outperforms existing methods on the LOL dataset without requiring dataset-specific training

Why It Matters

Enables autonomous robots to see clearly in the dark without costly training data.