Research at the intersection of neural networks and cellular automata is growing rapidly, driven by new training methods and a deeper study of the properties of these complex systems. One line of work examines the effect of internal noise on neural network performance, finding that noise can enhance resilience and improve test-time performance. New training algorithms, such as the Error Diffusion Learning Algorithm and the Dual-Individual Genetic Algorithm, promise gains in efficiency and accuracy.

The study of cellular automata is advancing in parallel, with results on structural properties such as permutivity, surjectivity, and reversibility, and with coalgebraic modal logics developed to reason about automata behavior.

Noteworthy papers include a hybrid prediction-error feedback mechanism for deep predictive coding networks, which achieves faster convergence and higher predictive accuracy, and a neuro-evolutionary approach to physics-aware symbolic regression that combines the strengths of evolutionary search and gradient-based tuning. A systematic study of designing odd-sized highly nonlinear Boolean functions with evolutionary algorithms also reports promising results, with genetic programming outperforming the other evolutionary algorithms tested.

Together, these developments push the boundaries of what is possible in neural networks and cellular automata, and they are likely to have a significant impact on the field.
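The surveyed work does not fix a particular noise model, but internal noise is commonly modeled as an additive Gaussian perturbation of hidden activations during training. A minimal sketch under that assumption (the network shape and parameter names are illustrative, not taken from any of the papers):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, W2, noise_std=0.0):
    """Two-layer MLP forward pass with optional internal noise:
    Gaussian perturbation added to the hidden activations."""
    h = np.tanh(x @ W1)
    if noise_std > 0:
        h = h + rng.normal(0.0, noise_std, size=h.shape)  # internal noise
    return h @ W2

# Tiny demo: with noise_std = 0 the pass is deterministic.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 2))
x = rng.normal(size=(3, 4))
assert np.allclose(forward(x, W1, W2), forward(x, W1, W2, noise_std=0.0))
```

Training with a small `noise_std` and evaluating with `noise_std=0` is one standard way such noise acts as a regularizer, which is consistent with the reported gains in resilience and test-time performance.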
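Permutivity has a concrete meaning for elementary cellular automata: a local rule is left- (right-) permutive when, with the other neighbours fixed, the map on the leftmost (rightmost) cell is a bijection; over a binary alphabet this means flipping that cell always flips the output. A sketch of the check (the function names are illustrative):

```python
def rule_table(rule_number):
    """Local rule of an elementary CA: (l, c, r) -> next state,
    read off the bits of the Wolfram rule number."""
    return {
        (l, c, r): (rule_number >> (l * 4 + c * 2 + r)) & 1
        for l in (0, 1) for c in (0, 1) for r in (0, 1)
    }

def is_left_permutive(rule_number):
    f = rule_table(rule_number)
    # Flipping the leftmost cell must always flip the output.
    return all(f[0, c, r] != f[1, c, r] for c in (0, 1) for r in (0, 1))

def is_right_permutive(rule_number):
    f = rule_table(rule_number)
    return all(f[l, c, 0] != f[l, c, 1] for l in (0, 1) for c in (0, 1))

# Rule 90 (XOR of the two neighbours) is permutive on both sides.
print(is_left_permutive(90), is_right_permutive(90))  # -> True True
```

Permutivity is the kind of structural property that connects to the surjectivity and reversibility questions mentioned above: every permutive one-dimensional CA is surjective.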
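The nonlinearity that the evolutionary algorithms maximize is the Hamming distance from a Boolean function to the nearest affine function, which can be computed exactly with the Walsh-Hadamard transform: NL(f) = 2^(n-1) - max_a |W_f(a)| / 2. A sketch of that computation (not the evolutionary search itself):

```python
import numpy as np

def nonlinearity(truth_table):
    """Nonlinearity of a Boolean function given as a truth table of
    length 2**n, via the fast Walsh-Hadamard transform."""
    n = int(np.log2(len(truth_table)))
    # Map {0,1} outputs to {+1,-1} signs.
    w = 1 - 2 * np.asarray(truth_table, dtype=np.int64)
    # In-place fast Walsh-Hadamard transform (butterfly passes).
    h = 1
    while h < len(w):
        for i in range(0, len(w), 2 * h):
            a = w[i:i + h].copy()
            b = w[i + h:i + 2 * h].copy()
            w[i:i + h] = a + b
            w[i + h:i + 2 * h] = a - b
        h *= 2
    return 2 ** (n - 1) - np.max(np.abs(w)) // 2

# AND of two variables reaches the 2-variable optimum.
print(nonlinearity([0, 0, 0, 1]))  # -> 1
```

A genetic-programming search would evolve candidate truth tables (or expressions producing them) and use a function like this as the fitness to maximize; the odd-sized case is hard precisely because the optimal nonlinearity for odd n is not known in general.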