Research & Papers

Analog Optical Inference on Million-Record Mortgage Data

Analog optical computing proves its real-world mettle, coming within a few points of digital models on massive financial data.

Deep Dive

A team of researchers led by Sofia Berloff has published a landmark study demonstrating the practical potential of analog optical computers (AOCs) for large-scale, real-world AI. Moving beyond small image datasets, they tested a digital simulation of an AOC on a massive financial task: classifying mortgage approvals using 5.84 million records from U.S. HMDA data. The optical system, using only 1,024 optical parameters within a 5,126-parameter model, achieved a balanced accuracy of 94.6%. This performance came remarkably close to a state-of-the-art digital model, XGBoost, which scored 97.9%. Critically, the study found that seven calibrated hardware non-idealities in the optical system imposed no measurable accuracy penalty, a promising sign for physical implementation.
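To make the headline numbers concrete, the sketch below shows how such a digital baseline comparison is typically set up: an XGBoost classifier scored with balanced accuracy (the mean of per-class recall), the metric reported in the study. This is an illustrative example, not the authors' code; it runs on synthetic tabular data as a stand-in for the 5.84 million HMDA records, and the feature counts, class balance, and model settings are assumptions.

```python
# Illustrative sketch (not the authors' pipeline): a generic XGBoost baseline
# on synthetic tabular data, evaluated with balanced accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score
from xgboost import XGBClassifier

# Synthetic stand-in for mortgage-approval records (the real study uses
# 5.84M HMDA rows; feature count and class skew here are made up).
X, y = make_classification(
    n_samples=50_000, n_features=20, n_informative=10,
    weights=[0.8, 0.2], random_state=0,
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0,
)

model = XGBClassifier(n_estimators=200, max_depth=6, learning_rate=0.1)
model.fit(X_train, y_train)

# Balanced accuracy = mean of per-class recall, so a majority-class
# predictor cannot look good on skewed approval/denial data.
print("balanced accuracy:", balanced_accuracy_score(y_test, model.predict(X_test)))
```

Balanced accuracy is the right yardstick here because raw accuracy on imbalanced approval data can be inflated by always predicting the majority class.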

The research provides a crucial roadmap for the field by systematically isolating three layers where accuracy is lost. The largest cost (5-8 percentage points) comes from encoding continuous data into a 127-bit binary format suitable for the optical core, a constraint shared with potential digital accelerators. The second, an architectural limitation, was revealed when widening the optical core from 16 to 48 channels yielded only a 0.5 percentage point gain, suggesting diminishing returns from width alone. The final layer, hardware fidelity, was shown to be a non-issue in this controlled experiment. This structured analysis identifies encoding as the primary bottleneck to tackle next in closing the performance gap with digital electronics, pointing future research toward more efficient optical data representations.
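The encoding penalty is easiest to picture with a toy example. The sketch below is hypothetical: the paper's exact 127-bit format is not detailed in this summary, so thermometer encoding of quantile-binned values stands in as one plausible way a continuous feature gets squeezed into a small binary budget, showing where information (and hence accuracy) can be lost before the optical core ever sees the data.

```python
# Hypothetical sketch of the lossy encoding step the paper measures: a
# continuous feature compressed into a fixed number of binary levels.
# The study's actual 127-bit scheme may differ; thermometer encoding of
# quantile bins is just one plausible stand-in.
import numpy as np

def thermometer_encode(x: np.ndarray, n_bits: int) -> np.ndarray:
    """Map each continuous value to n_bits binary 'thermometer' levels."""
    # Quantile thresholds estimated from the data itself (an assumption).
    thresholds = np.quantile(x, np.linspace(0, 1, n_bits + 2)[1:-1])
    return (x[:, None] >= thresholds[None, :]).astype(np.uint8)

rng = np.random.default_rng(0)
income = rng.lognormal(mean=11, sigma=0.5, size=8)   # made-up feature values
bits = thermometer_encode(income, n_bits=7)          # 7 bits for this feature
print(bits)                                          # coarse, lossy view of income
```

However the bits are allocated, the budget caps the resolution of every feature, which is why the same encoding cost shows up across all models, optical or digital.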

Key Points
  • The analog optical computer (AOC) digital twin achieved 94.6% accuracy on 5.84 million mortgage records, just 3.3 percentage points behind XGBoost (97.9%).
  • Hardware imperfections showed 'no measurable penalty,' but a 127-bit binary encoding cost 5-8 percentage points in accuracy across all models.
  • The study identifies a three-layer limitation stack (encoding, architecture, hardware) to guide future optical AI development toward real-world applications.

Why It Matters

This result makes a strong case for optical computing on massive, high-stakes datasets, paving the way for ultra-low-power AI inference in finance and beyond.