Advances in Neural Differential Equations and Tensor Decomposition

Research at the intersection of deep learning and classical numerical methods is driving rapid progress in neural differential equations and tensor decomposition. On the neural side, work focuses on improving training efficiency and accuracy, including scalable adjoint backpropagation for neural fractional-order differential equations and hybrid time-domain models that pair neural differential equations with recurrent neural networks. On the tensor side, techniques such as canonical polyadic (CP) decomposition and interpolative decomposition are being enhanced with new algorithms that reduce computational cost while improving fitting accuracy. Noteworthy contributions include a semigroup-homomorphic signature scheme, which gives a secure and efficient construction of homomorphic signatures over semigroups; a QR-based CP decomposition algorithm that markedly improves efficiency; and MixFunn, a neural network architecture for solving differential equations with improved precision, interpretability, and generalization in physics-informed settings.
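To make the CP decomposition idea concrete, here is a minimal sketch of the classical alternating-least-squares (ALS) approach for a 3-way tensor, written with plain NumPy. This illustrates the baseline that the QR- and extrapolation-based accelerations cited below improve upon; it is not the algorithm from any of the listed papers, and the function and variable names are our own.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of (m, r) and (n, r) -> (m*n, r)."""
    r = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, r)

def cp_als(X, rank, n_iter=200, seed=0):
    """Rank-`rank` CP decomposition of a 3-way tensor via alternating least squares.

    Returns factor matrices A, B, C such that
    X[i, j, k] ~= sum_r A[i, r] * B[j, r] * C[k, r].
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # C-order mode unfoldings, consistent with khatri_rao above.
    X1 = X.reshape(I, J * K)                       # rows indexed by i
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)    # rows indexed by j
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)    # rows indexed by k
    for _ in range(n_iter):
        # Each update solves a linear least-squares problem for one factor
        # while the other two are held fixed.
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

Each sweep updates one factor matrix at a time by solving a regularization-free least-squares problem; the `(B.T @ B) * (C.T @ C)` terms are the small `rank × rank` Gram matrices that make each update cheap relative to the tensor size.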

Sources

Efficient Training of Neural Fractional-Order Differential Equation via Adjoint Backpropagation

Semigroup-homomorphic Signature

The Akhiezer iteration and an inverse-free solver for Sylvester matrix equations

Efficient QR-Based CP Decomposition Acceleration via Dimension Tree and Extrapolation

On two families of iterative methods without memory

Fast and Accurate Interpolative Decompositions for General, Sparse, and Structured Tensors

Block Gauss-Seidel methods for t-product tensor regression

Sub-ODEs Simplify Taylor Series Algorithms for Ordinary Differential Equations

General form of the Gauss-Seidel equation to linearly approximate the Moore-Penrose pseudoinverse in random non-square systems and high order tensors

Hybrid Time-Domain Behavior Model Based on Neural Differential Equations and RNNs

MixFunn: A Neural Network for Differential Equations with Improved Generalization and Interpretability
