Recent work in this area shows a clear shift toward advanced mathematical and computational techniques for complex problems across domains. One notable trend is the use of randomized and low-rank tensor methods, which accelerate computation and yield more compact data representations; their adaptation to settings such as face recognition and multi-dimensional Markov models illustrates their versatility. There is also growing interest in implicit neural representations (INRs), which offer a flexible and efficient way to model continuous data, particularly in imaging and 3D reconstruction tasks. Further advances in subspace-constrained matrix factorization and tensor tomography are improving the extraction of meaningful structure from complex datasets. These innovations deepen theoretical understanding while showing practical value in real-world applications, from climber performance modeling to high-dimensional oscillatory integral operators, and together they point toward more sophisticated and scalable solutions in data science and beyond.
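To make the computational-speedup claim for randomized low-rank methods concrete, here is a minimal sketch of a randomized SVD in the Halko–Martinsson–Tropp style, assuming NumPy; the function name, oversampling parameter, and test matrix are illustrative choices, not part of any specific work surveyed above.

```python
import numpy as np

rng = np.random.default_rng(0)

def randomized_svd(A, rank, n_oversample=10):
    """Sketch of a randomized truncated SVD of A."""
    m, n = A.shape
    # A random Gaussian test matrix captures the dominant column space of A.
    Omega = rng.standard_normal((n, rank + n_oversample))
    # Orthonormal basis for the sketched range of A.
    Q, _ = np.linalg.qr(A @ Omega)
    # Project to a small (rank + oversample) x n problem and solve it exactly.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

# Test matrix of exact rank 5: the randomized approximation should be
# accurate to near machine precision.
A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 300))
U, s, Vt = randomized_svd(A, rank=5)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
print(err < 1e-8)
```

The key cost saving is that the full SVD is only computed on the small projected matrix `B` rather than on `A` itself, which is what makes these methods attractive for large datasets.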