The field of neural networks and optical computing is evolving rapidly, with innovative hardware and software solutions aimed at improving efficiency and scalability. Recent work centers on autonomous optical neural networks (ONNs) that can learn and adapt without relying on traditional von Neumann computers, prompting the exploration of new optimization algorithms and techniques that fully exploit their potential. There is also growing interest in Bayesian networks and neural operators, supported by a range of newly developed software packages and tools. Notably, researchers are working to close the theory-to-practice gap in neural network learning, focusing on the sampling complexity and convergence rates of different algorithms. Noteworthy papers include:
- A paper on model-free front-to-end training of a large high-performance laser neural network, which demonstrates a fully autonomous and parallel ONN using a multimode vertical-cavity surface-emitting laser (VCSEL).
- A paper introducing HyperNOs, a PyTorch library designed to streamline and automate the process of exploring neural operators, which achieves state-of-the-art results on many representative benchmarks.
- A paper studying the theory-to-practice gap for neural networks and neural operators, which derives upper bounds on the best-possible convergence rate of any learning algorithm and extends the theory-to-practice gap to the infinite-dimensional setting of operator learning.
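Neural operators of the kind HyperNOs explores learn mappings between function spaces rather than between fixed-size vectors; a common building block (in Fourier neural operators) is a spectral convolution that filters the low-frequency Fourier modes of the input function with learned weights. A minimal NumPy sketch of one such layer follows — this is illustrative only and is not the HyperNOs API; the function name `fourier_layer`, the grid size, and the mode count are all assumptions:

```python
import numpy as np

def fourier_layer(u, weights, modes):
    """One spectral-convolution layer of a Fourier neural operator (sketch).

    u:       (n,) real samples of the input function on a uniform 1D grid
    weights: (modes,) complex multipliers for the lowest Fourier modes
    modes:   number of low-frequency modes to keep (the rest are truncated)
    """
    u_hat = np.fft.rfft(u)                     # transform to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = u_hat[:modes] * weights  # apply learned spectral filter
    return np.fft.irfft(out_hat, n=len(u))     # back to physical space

# Toy usage: filter a sine wave with random (untrained) weights.
rng = np.random.default_rng(0)
n, modes = 64, 8
u = np.sin(2 * np.pi * np.arange(n) / n)
w = rng.normal(size=modes) + 1j * rng.normal(size=modes)
v = fourier_layer(u, w, modes)   # same grid size as the input
```

Because the learned weights live in frequency space, the same layer can be evaluated on grids of different resolutions, which is one reason operator-learning libraries benchmark across discretizations.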