Perceptron networks pass rational numbers between neurones, but space with constant negative curvature (hyperbolic space) has some completely amazing properties, ones it's taken me a while to work out proper intuition for, that seem stunningly useful for neural networks. Higher-dimensional generalisations of the complex numbers, such as the octonions, are another option for the values within a neural network, among many others.
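To make "values in hyperbolic space" a bit more concrete, here's a minimal numpy sketch of my own (an illustration, not anyone's published code) assuming the Poincaré ball model, which is one common way hyperbolic values get represented: points live inside the unit ball, with distance blowing up near the boundary, and Möbius addition playing the role that adding a bias vector plays in flat space.

```python
# Minimal sketch of two basic operations on the Poincare ball model of
# hyperbolic space (my own illustration; formulas are the standard ones
# for curvature -1, but treat this as a sketch, not reference code).
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Hyperbolic distance between two points inside the unit ball."""
    diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * diff / max(denom, eps))

def mobius_add(u, v):
    """Mobius addition: the curved-space analogue of adding a bias vector."""
    uv = np.dot(u, v)
    nu, nv = np.sum(u ** 2), np.sum(v ** 2)
    num = (1.0 + 2.0 * uv + nv) * u + (1.0 - nu) * v
    return num / (1.0 + 2.0 * uv + nu * nv)

# Two points near the boundary are far apart hyperbolically even though
# their Euclidean distance is modest -- volume grows exponentially with
# radius, which is part of why trees and hierarchies embed so naturally.
a = np.array([0.95, 0.0])
b = np.array([0.0, 0.95])
print(poincare_distance(a, b))   # roughly 6.6: a long way, hyperbolically
print(np.linalg.norm(a - b))     # roughly 1.34: not far, Euclideanly
print(mobius_add(a, np.array([0.01, 0.01])))  # result stays inside the ball
```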
Generally, even with GPT-3's help, the knowledge of how attempts to use these sorts of values in neural networks have gone is tied up in scientific papers that are, for me, incomprehensible. Would anyone be interested in describing the pros, cons, and fun oddities in something more like plain English, or plain(er) math?