Formalizing Intelligence: Optimization and Compositionality

Recent research on natural and artificial intelligence shows a marked shift toward formalizing the principles that govern intelligence as optimization problems. One line of work models the evolution of natural intelligence as constrained optimization, in which a system maximizes information entropy under resource limitations. This framing provides a theoretical account of the complexity and efficiency of learning systems and helps explain why neural networks are effective as collections of collaborating units. A second line of work seeks to define and measure compositionality: a new complexity-based theory, grounded in algorithmic information theory, offers a formal, measurable definition that bridges cognitive science and AI and aims to inspire models that better capture compositional thought. Consistent with this theme, large language models (LLMs) show promising compositional generalization, for example to novel adjective-noun pairs, yet they still fall short of reproducing the full distribution of human judgments in some contexts.
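To make the entropy-maximization principle concrete, here is a minimal sketch, not the paper's actual model: the cost vector, budget, and solver below are assumptions for illustration. The classical solution to "maximize Shannon entropy subject to an expected resource budget" is a Gibbs distribution, and the budget fixes its temperature, which the sketch finds by bisection.

```python
import numpy as np

def max_entropy_under_budget(costs, budget, tol=1e-10):
    """Maximize H(p) subject to sum_i p_i * costs[i] <= budget and sum_i p_i = 1.

    The Lagrangian solution has Gibbs form p_i ∝ exp(-lam * costs[i]);
    expected cost decreases monotonically in lam, so we bisect on lam.
    """
    costs = np.asarray(costs, dtype=float)
    assert budget > costs.min(), "budget must exceed the cheapest option"

    # If the uniform distribution already fits the budget, it is the
    # unconstrained entropy maximizer and the constraint is inactive.
    p_uniform = np.full(len(costs), 1.0 / len(costs))
    if p_uniform @ costs <= budget:
        return p_uniform

    def gibbs(lam):
        w = np.exp(-lam * (costs - costs.min()))  # shift for numerical stability
        return w / w.sum()

    lo, hi = 0.0, 1.0
    while gibbs(hi) @ costs > budget:  # grow hi until the budget is met
        hi *= 2.0
    while hi - lo > tol:               # bisect on the multiplier
        mid = 0.5 * (lo + hi)
        if gibbs(mid) @ costs > budget:
            lo = mid
        else:
            hi = mid
    return gibbs(hi)

# Hypothetical per-unit resource costs; tighter budgets concentrate
# probability on cheaper units while staying as uniform as possible.
p = max_entropy_under_budget([1.0, 2.0, 4.0, 8.0], budget=2.5)
```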
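The complexity-based theory of compositionality is stated in terms of Kolmogorov complexity, which is uncomputable, so any runnable illustration must substitute a proxy. The toy sketch below is not the paper's construction: the zlib compressor stand-in, the conditional-complexity approximation, and the score itself are all illustrative assumptions. It asks how much of a whole's description is already carried by its parts.

```python
import zlib

def approx_K(s: str) -> int:
    """Crude upper bound on Kolmogorov complexity: compressed length in bytes."""
    return len(zlib.compress(s.encode("utf-8"), 9))

def compositionality_score(whole: str, parts: list[str]) -> float:
    """Toy proxy: 1 - K(whole | parts) / K(whole), using the standard
    compression trick K(whole | parts) ≈ K(parts + whole) - K(parts).
    Higher values suggest the whole is cheap to describe given its parts."""
    k_whole = approx_K(whole)
    ctx = "".join(parts)
    k_cond = approx_K(ctx + whole) - approx_K(ctx)
    return 1.0 - k_cond / k_whole

parts = ["the quick brown fox", "jumps over the lazy dog"]
whole = "the quick brown fox jumps over the lazy dog"
# Expected to be relatively high: the whole is nearly a repeat of its parts.
print(compositionality_score(whole, parts))
```

Because zlib has a fixed header overhead, the score is only meaningful for reasonably long strings; the formal theory does not suffer from this artifact.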

Sources

"Efficient Complexity": a Constrained Optimization Approach to the Evolution of Natural Intelligence

The Representation of Meaningful Precision, and Accuracy

A Complexity-Based Theory of Compositionality

Is artificial intelligence still intelligence? LLMs generalize to novel adjective-noun pairs, but don't mimic the full human distribution
