Biologically Inspired Advances in Neural Learning and Energy Efficiency

The field of neural learning is moving towards more biologically inspired and energy-efficient models. Researchers are drawing inspiration from the brain's ability to learn and adapt continuously while consuming remarkably little energy, which has led to new frameworks and mechanisms, such as temporal development processes and neuromodulatory signals, that improve learning efficiency and adaptation in artificial neural systems.

Noteworthy papers in this area include the work on Structured Knowledge Accumulation, which introduces a continuous, self-organizing learning model based on the principle of entropic least action. The paper on Continual Learning of Multiple Cognitive Functions with Brain-inspired Temporal Development Mechanism proposes the sequential evolution of long-range connections between cognitive modules, promoting positive knowledge transfer and reducing energy consumption. The Watts-Per-Intelligence framework provides a mathematical foundation for quantifying energy efficiency in intelligent systems by linking energy consumption to information-processing capacity. Finally, work on Spiking Neural Networks, including PETNet and a Calcium-based Hebbian Rule, shows great promise for low-energy, biologically plausible alternatives to conventional artificial neural networks.
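The Sources below include an overview of three-factor learning in spiking neural networks. As a rough illustration of the core idea, that a neuromodulatory third factor gates Hebbian plasticity, the sketch below implements a generic reward-modulated eligibility-trace update in NumPy. The layer sizes, time constants, and the step function are illustrative assumptions, not the method of any cited paper.

```python
import numpy as np

# Minimal sketch of a generic three-factor learning rule for spiking synapses.
# Factor 1: presynaptic spikes; factor 2: postsynaptic spikes; factor 3: a
# scalar neuromodulatory signal (e.g. reward). All constants are assumptions.

rng = np.random.default_rng(0)

n_pre, n_post = 50, 10                        # illustrative layer sizes
w = rng.normal(0.0, 0.1, (n_post, n_pre))     # synaptic weights
eligibility = np.zeros_like(w)                # slow trace of pre/post coincidences

tau_e = 50.0   # eligibility-trace time constant (ms), assumed
eta = 1e-3     # learning rate, assumed
dt = 1.0       # simulation step (ms)

def step(pre_spikes, post_spikes, modulator):
    """One plasticity step of the three-factor rule.

    pre_spikes  : (n_pre,)  binary spike vector (presynaptic activity)
    post_spikes : (n_post,) binary spike vector (postsynaptic activity)
    modulator   : scalar third factor, e.g. reward or surprise
    """
    global eligibility, w
    # Hebbian coincidence of pre- and postsynaptic spikes feeds the trace.
    hebbian = np.outer(post_spikes, pre_spikes)
    eligibility += dt / tau_e * (hebbian - eligibility)
    # The third factor converts the transient trace into an actual weight change.
    w += eta * modulator * eligibility

# Toy usage: random spiking activity, reward delivered only on the last step.
for t in range(200):
    pre = (rng.random(n_pre) < 0.05).astype(float)
    post = (rng.random(n_post) < 0.05).astype(float)
    reward = 1.0 if t == 199 else 0.0
    step(pre, post, reward)

print("mean absolute weight:", float(np.abs(w).mean()))
```

The design point this sketch tries to convey is that pre/post coincidences alone only build a transient eligibility trace; weights change only when the neuromodulatory factor arrives, which is what lets such rules credit synapses for outcomes that are signaled with a delay.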
Sources
Structured Knowledge Accumulation: The Principle of Entropic Least Action in Forward-Only Neural Learning
Three-Factor Learning in Spiking Neural Networks: An Overview of Methods and Trends from a Machine Learning Perspective