Recent advances in graph-based machine learning and optimization are pushing the boundaries of distributed systems and combinatorial optimization. A notable trend is the integration of graph neural networks (GNNs) with novel architectures and training paradigms, enabling fully distributed online training and improving adaptability in large-scale networked systems. These innovations not only improve the efficiency of GNNs but also broaden their applicability across domains such as wireless networks, power grids, and transportation systems. There is also a growing focus on energy efficiency: hybrid spiking neural networks have been proposed for scientific machine learning tasks, particularly regression problems in computational mechanics. These hybrid models exploit sparse, event-driven communication to reduce energy consumption, making them well suited to edge computing. Meanwhile, advances in motif mining and topological analysis are providing new tools for efficient power system analysis, addressing computational bottlenecks with AI-powered methods. Overall, the field is witnessing a convergence of machine learning, optimization, and graph theory, applied to complex real-world problems with greater efficiency and scalability.
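To make the sparse-communication argument concrete, the following is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic unit behind most spiking models. All names and parameter values here are illustrative assumptions, not taken from any specific system described above; real hybrid spiking architectures combine such units with conventional layers.

```python
# Illustrative sketch only: a leaky integrate-and-fire (LIF) neuron
# demonstrating why spiking models communicate sparsely. The function
# name, leak factor, and threshold are hypothetical choices.

def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """Return the time-step indices at which the neuron fires.

    The membrane potential decays by `leak` each step, accumulates the
    input current, and emits a spike (then resets) when it crosses
    `threshold`. Between spikes nothing is transmitted downstream --
    this event-driven sparsity is the source of the energy savings.
    """
    v = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        v = leak * v + current   # leaky integration of input current
        if v >= threshold:
            spikes.append(t)     # emit an event only on threshold crossing
            v = 0.0              # reset membrane potential after firing
    return spikes

# A constant sub-threshold input of 0.3 over 20 steps yields only 5 events:
events = lif_spikes([0.3] * 20)
print(events)  # [3, 7, 11, 15, 19]
```

Only 5 of the 20 time steps produce any output at all; a dense artificial neuron would transmit a real-valued activation at every step, which is the contrast that motivates spiking models for edge deployment.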