High-Dimensional Data and Complex System Dynamics

Report on Current Developments in the Research Area

General Direction of the Field

Recent advances in this research area reflect a clear shift toward leveraging high-dimensional data and complex system dynamics for improved estimation, control, and model discovery. Traditional methods are converging with modern computational techniques, particularly machine learning and other data-driven approaches. This trend is evident in several key areas:

  1. Optimal State Estimation and Filtering: There is growing interest in extending classical filtering techniques, such as the Kalman filter, to infinite-dimensional measurements and more complex noise models. This development is crucial for systems equipped with modern sensing modalities such as vision and lidar, which produce very high-dimensional data. The focus is on deriving optimal linear filters that remain stable and accurate at this scale; a minimal high-dimensional Kalman-update sketch appears after this list.

  2. Emergence and Ergodicity in State Space Models: The concept of emergence, familiar from large-scale models such as language models, is being explored in simpler theoretical frameworks. Researchers are investigating critical thresholds in parameter count beyond which predictions become stable and the learning error remains bounded. This work highlights the role of long-range correlations and the minimum number of parameters needed for reliable prediction; a toy prediction-error experiment appears after this list.

  3. Machine Learning and Dynamical Systems: The intersection of machine learning with dynamical systems is gaining traction, particularly for model discovery and system identification. Techniques such as Kolmogorov-Arnold networks are being employed to overcome the limitations of sparse optimization in discovering governing equations for complex systems. This approach can yield multiple approximate models that each capture the essential dynamics of the system; a toy additive-model sketch in this spirit appears after this list.

  4. Symbolic Transfer Entropy and Causal Inference: The estimation of symbolic transfer entropy, a robust method for quantifying directed (causal) relationships between time series, is being refined to cope with complex data and high embedding dimensions. This work is crucial for obtaining reliable causal measures in non-stationary environments, which is increasingly relevant in fields such as neuroscience and economics; a minimal ordinal-pattern sketch appears after this list.

  5. Data-Driven Representation of Nonlinear Systems: Koopman operator theory and Willems' fundamental lemma are being extended to nonlinear systems that admit a Koopman linear embedding. This approach removes the need to construct lifting functions explicitly and yields a data-driven representation of the nonlinear system, while emphasizing the importance of both the breadth and the depth of the trajectory data for accurate modeling; a linear-system Hankel-matrix sketch appears after this list.
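
The following sketch, a minimal illustration of item 1, runs a standard Kalman measurement update for a low-dimensional state observed through a very high-dimensional linear measurement; the matrix inversion lemma is used so that only state-sized matrices are inverted. All model matrices, dimensions, and noise levels are illustrative assumptions and are not taken from the cited paper, which treats the infinite-dimensional case rigorously.

```python
# Minimal sketch: Kalman measurement update for a low-dimensional state
# observed through a very high-dimensional (e.g. image-like) measurement.
# All quantities are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 5000                      # state dimension << measurement dimension

A = 0.95 * np.eye(n)                # state transition
C = rng.standard_normal((m, n))     # high-dimensional observation map
Q = 0.01 * np.eye(n)                # process noise covariance
r = 0.5                             # i.i.d. measurement noise variance (R = r * I)

def kf_step(x_hat, P, y):
    # Predict.
    x_pred = A @ x_hat
    P_pred = A @ P @ A.T + Q
    # Update.  With R = r * I, the matrix inversion lemma lets us invert an
    # n x n matrix instead of the m x m innovation covariance.
    P_new = np.linalg.inv(np.linalg.inv(P_pred) + (C.T @ C) / r)
    K = P_new @ C.T / r             # Kalman gain, n x m
    x_new = x_pred + K @ (y - C @ x_pred)
    return x_new, P_new

# One synthetic measurement of an unknown state.
x_true = rng.standard_normal(n)
y = C @ x_true + np.sqrt(r) * rng.standard_normal(m)
x_hat, P = kf_step(np.zeros(n), np.eye(n), y)
print("estimation error:", np.linalg.norm(x_hat - x_true))
```
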
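To make the question in item 2 concrete, the next sketch fits autoregressive predictors of increasing order to the output of a linear state-space model with slowly decaying correlations and reports the held-out one-step prediction error as the number of parameters grows. The system, data length, and orders are illustrative assumptions, not the setting analyzed in the cited paper.

```python
# Illustrative experiment: one-step prediction error of autoregressive
# predictors with increasing parameter counts, on the output of a linear
# state-space model with slowly decaying (long-range) correlations.
import numpy as np

rng = np.random.default_rng(4)
T = 20_000
modes = np.array([0.999, 0.95, 0.6])      # slowly decaying modes
x = np.zeros(3)
y = np.empty(T)
for t in range(T):
    x = modes * x + 0.1 * rng.standard_normal(3)
    y[t] = x.sum() + 0.01 * rng.standard_normal()

def ar_prediction_error(y, order):
    # Least-squares fit of y[t] from the previous `order` samples,
    # evaluated on the held-out second half of the series.
    X = np.column_stack([y[order - k - 1: -k - 1] for k in range(order)])
    target = y[order:]
    split = len(target) // 2
    coef, *_ = np.linalg.lstsq(X[:split], target[:split], rcond=None)
    resid = target[split:] - X[split:] @ coef
    return np.sqrt(np.mean(resid ** 2))

for order in (1, 2, 4, 8, 16, 32):
    print(f"{order:2d} parameters -> RMSE {ar_prediction_error(y, order):.4f}")
```
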
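The next toy example mirrors the additive structure behind Kolmogorov-Arnold networks from item 3: each component of a system's right-hand side is approximated as a sum of univariate functions of the state variables. Here the univariate functions are simple polynomial bases fit by least squares rather than the trainable splines and compositions a real KAN would use, and the dynamics are an invented illustrative example.

```python
# Toy sketch of Kolmogorov-Arnold-style model discovery: approximate each
# component of a right-hand side as a sum of learned univariate functions
# of the state variables (polynomial bases standing in for splines).
import numpy as np

rng = np.random.default_rng(1)

def rhs(state):
    # "Unknown" dynamics we pretend to observe; additive in x and y.
    x, y = state
    return np.array([-x**3 + y, np.sin(x) - y])

# Sampled states and noisy derivative observations as training data.
X = rng.uniform(-2.0, 2.0, size=(500, 2))
dX = np.array([rhs(s) for s in X]) + 0.01 * rng.standard_normal((500, 2))

def univariate_basis(v, degree=5):
    # Basis acting on a single variable (stand-in for a learnable spline).
    return np.column_stack([v**k for k in range(1, degree + 1)])

# Design matrix: one univariate block per input variable, so each state
# variable enters only through its own learned one-dimensional function.
Phi = np.hstack([univariate_basis(X[:, j]) for j in range(2)])
coeffs, *_ = np.linalg.lstsq(Phi, dX, rcond=None)

pred = Phi @ coeffs
print("RMS fit error:", np.sqrt(np.mean((pred - dX) ** 2)))
```
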
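For item 4, the sketch below estimates symbolic transfer entropy between two time series: each series is symbolized by the ordinal patterns of short embedding windows, and the transfer entropy is computed from the resulting joint symbol counts. The embedding dimension, series length, and coupling in the demonstration are illustrative assumptions.

```python
# Minimal sketch of symbolic transfer entropy TE(X -> Y) via ordinal patterns.
import numpy as np
from collections import Counter

def ordinal_symbols(x, dim=3):
    # Replace each length-`dim` window by its ordinal (rank-order) pattern.
    windows = np.lib.stride_tricks.sliding_window_view(x, dim)
    return [tuple(np.argsort(w)) for w in windows]

def symbolic_transfer_entropy(x, y, dim=3):
    # TE(X -> Y) = sum p(y+, y, x) * log2[ p(y+ | y, x) / p(y+ | y) ],
    # where y+ is the next symbol of Y and all probabilities are plug-in
    # estimates from symbol counts.
    sx, sy = ordinal_symbols(x, dim), ordinal_symbols(y, dim)
    triples = list(zip(sy[1:], sy[:-1], sx[:-1]))
    n = len(triples)
    p_abc = Counter(triples)
    p_bc = Counter((b, c) for _, b, c in triples)
    p_ab = Counter((a, b) for a, b, _ in triples)
    p_b = Counter(b for _, b, _ in triples)
    te = 0.0
    for (a, b, c), cnt in p_abc.items():
        te += (cnt / n) * np.log2((cnt / p_bc[(b, c)]) / (p_ab[(a, b)] / p_b[b]))
    return te

rng = np.random.default_rng(2)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)   # Y is driven by past X
print("TE X->Y:", symbolic_transfer_entropy(x, y))
print("TE Y->X:", symbolic_transfer_entropy(y, x))
```
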
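Finally, for item 5, the sketch below shows the Hankel-matrix construction at the heart of Willems' fundamental lemma for an ordinary linear system: one persistently exciting input-output trajectory is stacked into Hankel matrices, and a fresh length-L trajectory is then reproduced as a linear combination of their columns. The system matrices and horizons are illustrative assumptions; the cited paper extends this construction to nonlinear systems admitting a Koopman linear embedding.

```python
# Minimal sketch of Willems' fundamental lemma for a linear system: any
# length-L trajectory lies in the column span of Hankel matrices built
# from one persistently exciting experiment.
import numpy as np

rng = np.random.default_rng(3)
A = np.array([[0.9, 0.2], [0.0, 0.8]])
B = np.array([0.0, 1.0])
C = np.array([1.0, 0.0])

def simulate(u):
    # Zero initial state; returns the scalar output trajectory.
    x, ys = np.zeros(2), []
    for uk in u:
        ys.append(C @ x)
        x = A @ x + B * uk
    return np.array(ys)

def hankel(w, L):
    # Hankel matrix with L rows built from the 1-D signal w.
    return np.column_stack([w[i:i + L] for i in range(len(w) - L + 1)])

# One persistently exciting experiment.
T, L = 200, 10
u_d = rng.standard_normal(T)
y_d = simulate(u_d)
H = np.vstack([hankel(u_d, L), hankel(y_d, L)])

# A new length-L trajectory generated with a different input.
u_new = np.sin(0.3 * np.arange(L))
y_new = simulate(u_new)
w_new = np.concatenate([u_new, y_new])

# Solve H g = w_new; the residual is (numerically) zero, i.e. the new
# trajectory is a linear combination of the recorded Hankel columns.
g, *_ = np.linalg.lstsq(H, w_new, rcond=None)
print("representation residual:", np.linalg.norm(H @ g - w_new))
```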

Noteworthy Papers

  • Optimal Linear Filtering for Discrete-Time Systems with Infinite-Dimensional Measurements: This paper introduces a novel optimal linear filter for infinite-dimensional measurements, providing explicit derivations and stability conditions.

  • State space models, emergence, and ergodicity: How many parameters are needed for stable predictions?: The study reveals a critical threshold in parameter space for stable predictions in linear dynamical systems, akin to emergence in large models.

  • Machine Learning Toric Duality in Brane Tilings: The application of machine learning to Seiberg duality in quantum field theories achieves remarkably accurate results, demonstrating the potential of ML in complex theoretical physics.

  • Data-driven model discovery with Kolmogorov-Arnold networks: This work presents a general framework for model discovery in complex systems, highlighting the non-uniqueness of approximate models that capture system dynamics.

  • Willems' Fundamental Lemma for Nonlinear Systems with Koopman Linear Embedding: The extension of Willems' lemma to nonlinear systems with Koopman embedding provides a robust data-driven representation without the need for lifting functions.

Sources

Optimal Linear Filtering for Discrete-Time Systems with Infinite-Dimensional Measurements

State space models, emergence, and ergodicity: How many parameters are needed for stable predictions?

Some Thoughts on Symbolic Transfer Entropy

Modeling a demographic problem using the Leslie matrix

Optimal state estimation: Turnpike analysis and performance results

Machine Learning Toric Duality in Brane Tilings

Data-driven model discovery with Kolmogorov-Arnold networks

Learning Linear Dynamics from Bilinear Observations

Willems' Fundamental Lemma for Nonlinear Systems with Koopman Linear Embedding
