Report on Current Developments in Graph Neural Networks (GNNs)
General Direction of the Field
The field of Graph Neural Networks (GNNs) is currently witnessing a shift towards more nuanced and computationally efficient models of expressivity. Researchers are increasingly focusing on the limitations and practical implications of uniform expressivity, particularly in the context of large-scale graph data. The traditional focus on uniform expressivity, which ensures that GNNs can express queries without parameter dependency on the size of input graphs, is being re-evaluated. Recent studies suggest that while uniform expressivity is theoretically desirable, it may be overly restrictive in practical applications. Instead, there is a growing interest in developing GNNs that can efficiently express complex queries with a logarithmic dependency on the size of input graphs, thereby making them more scalable and applicable to real-world scenarios.
Another significant trend is the rethinking of GNN expressiveness from a computational-model perspective. Traditional analyses often relied on comparisons with the Weisfeiler-Lehman (WL) test or classical graph algorithms, but these approaches have been critiqued for overlooking computational costs and making unrealistic assumptions about resources. There is a pressing need for a well-defined computational model that can serve as a foundation for assessing GNN expressiveness more accurately. This has led to the introduction of the Resource-Limited CONGEST (RL-CONGEST) model, which incorporates optional preprocessing and postprocessing to better reflect practical computational constraints.
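For context, the 1-WL test that these traditional analyses compare against is iterative color refinement. The sketch below is a minimal illustration of that baseline yardstick only, not of the RL-CONGEST model; the graphs and round count are illustrative choices.

```python
def wl_fingerprint(adj, rounds=3):
    """One-dimensional Weisfeiler-Lehman color refinement.

    adj: dict mapping each node to an iterable of its neighbors.
    Each round, a node's new color is the pair (own color, sorted
    multiset of neighbor colors); the sorted multiset of final colors
    serves as the graph's 1-WL fingerprint.
    """
    colors = {v: 0 for v in adj}  # start from a uniform coloring
    for _ in range(rounds):
        colors = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
    return tuple(sorted(colors.values()))

# 1-WL separates a triangle from a path on three nodes
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_fingerprint(triangle) != wl_fingerprint(path))  # True
```

Using the nested color tuples directly (rather than compressing them to integer ids) keeps the fingerprints canonical, so two graphs refined independently can be compared.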
Dynamic Graph Neural Networks (DyGNNs) are also gaining attention, particularly for their ability to handle evolving graphs. However, the limited expressive power of existing DyGNNs has been a significant barrier. Recent advancements have focused on developing DyGNNs with provable high-order expressive power, which can capture important evolving patterns in dynamic graphs. This has led to the proposal of new frameworks, such as the k-dimensional Dynamic WL tests (k-DWL), which serve as a basis for quantifying the expressive power of DyGNNs.
Additionally, there is a growing interest in fine-grained measures of GNN expressiveness, particularly from a homomorphism counting perspective. This approach provides a more detailed understanding of how GNNs can capture complex graph structures and relationships. The introduction of generalized folklore Weisfeiler-Leman (GFWL) algorithms offers a flexible design basis for expressive GNNs, enabling a more systematic analysis of their homomorphism counting power.
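To make the homomorphism-counting perspective concrete, the brute-force counter below enumerates all maps from a small pattern graph into a target graph and keeps those that preserve edges. This is an illustrative sketch of the quantity being counted; the GFWL algorithms themselves are far more refined, and the example graphs are my own.

```python
from itertools import product

def count_homomorphisms(pattern, target):
    """Count homomorphisms from `pattern` into `target`.

    A homomorphism assigns every pattern node a target node so that
    every pattern edge lands on a target edge. Graphs are dicts:
    node -> set of neighbors (undirected, no self-loops).
    """
    p_nodes = list(pattern)
    t_nodes = list(target)
    count = 0
    for image in product(t_nodes, repeat=len(p_nodes)):
        mapping = dict(zip(p_nodes, image))
        # Check that every pattern edge is preserved
        if all(mapping[v] in target[mapping[u]]
               for u in pattern for v in pattern[u]):
            count += 1
    return count

triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
edge = {0: {1}, 1: {0}}
# An edge maps into a triangle in 6 ways (ordered endpoint choices)
print(count_homomorphisms(edge, triangle))  # 6
```

Homomorphism counts like these form the feature vectors against which a GNN's distinguishing power can be measured: two graphs agree on a class of patterns exactly when a correspondingly expressive model cannot tell them apart.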
Finally, the expressive power of tree-structured probabilistic circuits (PCs) is being explored in depth. While PCs with general directed acyclic graph (DAG) structures have been well-studied, there is a renewed interest in understanding the limitations and capabilities of tree-structured PCs. Recent work has shown that while there is a sub-exponential upper bound on the size of equivalent tree-structured PCs, there is also a super-polynomial separation between tree and DAG-structured PCs under certain depth restrictions.
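For readers unfamiliar with probabilistic circuits, the sketch below evaluates a tiny tree-structured PC over two binary variables bottom-up, with sum nodes as mixtures and product nodes over disjoint scopes. The structure and parameters are hypothetical, chosen only to illustrate the model class the paper studies.

```python
def evaluate(node, assignment):
    """Evaluate a probabilistic circuit bottom-up at an assignment.

    Nodes are tuples:
      ("leaf", var, probs)       - univariate distribution over `var`
      ("prod", children)         - product node (independent scopes)
      ("sum", weights, children) - mixture; weights sum to 1
    """
    kind = node[0]
    if kind == "leaf":
        _, var, probs = node
        return probs[assignment[var]]
    if kind == "prod":
        result = 1.0
        for child in node[1]:
            result *= evaluate(child, assignment)
        return result
    _, weights, children = node  # sum node
    return sum(w * evaluate(c, assignment) for w, c in zip(weights, children))

# A tree-structured PC over binary X, Y: a mixture of two product
# components (hypothetical parameters).
pc = ("sum", [0.4, 0.6], [
    ("prod", [("leaf", "X", {0: 0.9, 1: 0.1}),
              ("leaf", "Y", {0: 0.2, 1: 0.8})]),
    ("prod", [("leaf", "X", {0: 0.3, 1: 0.7}),
              ("leaf", "Y", {0: 0.5, 1: 0.5})]),
])

# A valid PC defines a distribution: probabilities sum to 1
total = sum(evaluate(pc, {"X": x, "Y": y}) for x in (0, 1) for y in (0, 1))
print(round(total, 10))  # 1.0
```

In a tree-structured PC each node feeds exactly one parent, whereas a DAG-structured PC may reuse a sub-circuit in several places; the size separations discussed above concern exactly how costly it is to forgo that reuse.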
Noteworthy Papers
"Is uniform expressivity too restrictive? Towards efficient expressivity of graph neural networks."
This paper challenges the conventional view of uniform expressivity, proposing that a logarithmic dependency on input graph size is more practical.

"Rethinking the Expressiveness of GNNs: A Computational Model Perspective."
Introduces the RL-CONGEST model to better reflect practical computational constraints in assessing GNN expressiveness.

"Towards Dynamic Graph Neural Networks with Provably High-Order Expressive Power."
Proposes a new framework, HopeDGN, with provable high-order expressive power for dynamic graphs.

"Fine-Grained Expressive Power of Weisfeiler-Leman: A Homomorphism Counting Perspective."
Offers a unified framework for analyzing GNN expressiveness through homomorphism counting.

"On the Expressive Power of Tree-Structured Probabilistic Circuits."
Provides new insights into the limitations and capabilities of tree-structured probabilistic circuits.