GLOM enables hierarchical representations in continuous spaces

By representing a [[Tokenization for i+1 language learning is non-trivial|token]] at multiple levels of abstraction concurrently, from fine structure to coarse, one can derive hierarchical representations. Elements that have much in common at coarse levels may sit on the same branch, only to break off at a finer level. Combined with other mechanisms such as content-based memory addressing, this approach could support consequential explanations in perception, memory, and intelligence. GLOM might also be useful in a multi-modal setting, housing multi-modal percepts of an object at the levels below its representation. It might likewise be useful for temporal representations.
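The branching idea can be sketched as a toy: give each token a "column" of one vector per level, share the coarse-level vector between tokens on the same branch, and let the fine levels diverge. Everything below (level count, dimensions, the `agree` test) is an invented illustration, not the GLOM paper's actual architecture or notation.

```python
import numpy as np

# Toy sketch: each token holds one embedding per level of abstraction,
# ordered fine -> coarse. Levels, dims, and names are illustrative only.
LEVELS = 3  # level 0 = finest, level 2 = coarsest
DIM = 4     # embedding size per level

rng = np.random.default_rng(0)

def make_column(coarse_vec, fine_noise=1.0):
    """A column: coarse level shared exactly, finer levels perturbed."""
    column = [coarse_vec + fine_noise * rng.normal(size=DIM)  # fine: diverges
              for _ in range(LEVELS - 1)]
    column.append(coarse_vec)                                 # coarse: shared
    return column

# Two tokens on the same coarse "branch" (identical top-level vector)...
shared = rng.normal(size=DIM)
a, b = make_column(shared), make_column(shared)
# ...and one token on a different branch.
c = make_column(rng.normal(size=DIM))

def agree(col1, col2, level, tol=1e-6):
    """Tokens 'agree' at a level when their vectors at that level match."""
    return bool(np.linalg.norm(col1[level] - col2[level]) < tol)

print(agree(a, b, level=2))  # True: same branch at the coarse level
print(agree(a, b, level=0))  # False: they break off at the fine level
print(agree(a, c, level=2))  # False: different coarse branches
```

In GLOM proper the higher-level vectors are not fixed but settle into "islands of agreement" through iterative message passing; this snippet only shows the resulting structure, not the dynamics.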

Resources

Backlinks