Attention is representational resource allocation

Attention can be seen as a process of representational resource allocation. In this view, attention is the enactive process of representing stimuli at varying levels of detail: attended stimuli might have their structure better preserved and more of their variance explained. This model is compatible with the attention formalisms used in ML, such as self-attention in transformers.
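As a point of comparison, here is a minimal NumPy sketch of scaled dot-product self-attention (the standard transformer formulation). The resource-allocation reading is visible in the softmax: each row of weights sums to one, so every position distributes a fixed budget of representational weight across the sequence, and heavily attended positions contribute more detail to the output. The function name, shapes, and parameters are illustrative assumptions, not taken from any particular library.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence.

    x:             (seq_len, d_model) input representations
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    """
    q = x @ w_q                                   # queries
    k = x @ w_k                                   # keys
    v = x @ w_v                                   # values
    scores = q @ k.T / np.sqrt(k.shape[-1])       # similarity, scaled

    # Softmax: each row sums to 1, so each position allocates a
    # fixed "budget" of weight across all positions in the sequence.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    # Output is a weighted mixture of values: strongly attended
    # positions are represented in more detail downstream.
    return weights @ v
```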
