Discrete structure embeddings in continuous space are glitchy

Not only is most of machine learning interpolative and therefore limited in generalization, it is also often ill suited for manipulating discrete structures. Just as a truncated Fourier series approximates a square wave with glitches near its discontinuities (the Gibbs phenomenon), neural networks are brittle when representing discrete, discontinuous structures. However, one could argue that the brain itself supports a continuous mental and phenomenal space, which sometimes approximates discrete systems.
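A minimal sketch of the analogy, assuming only NumPy (the function name square_wave_partial_sum is purely illustrative, not from any source here): the partial Fourier sum of a ±1 square wave keeps overshooting by roughly 9% of the half-jump near the discontinuity no matter how many terms are added, which is the kind of persistent glitch a smooth approximator exhibits at a discrete boundary.

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of a ±1 square wave using its first n_terms odd harmonics."""
    k = np.arange(1, 2 * n_terms, 2)                        # odd harmonics 1, 3, 5, ...
    # Square wave Fourier series: (4/pi) * sum over odd k of sin(k x) / k
    return (4 / np.pi) * np.sum(np.sin(np.outer(x, k)) / k, axis=1)

# Sample just to the right of the jump at x = 0 and measure the overshoot above 1.
x = np.linspace(1e-4, 0.5, 8000)
for n in (10, 100, 1000):
    overshoot = square_wave_partial_sum(x, n).max() - 1.0
    print(f"{n:5d} terms: overshoot ≈ {overshoot:.4f}")     # stays near 0.09 regardless of n
```

Adding more harmonics narrows the overshooting lobe but never removes it, just as scaling a smooth model can localize, but not eliminate, its errors around discontinuities.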
