Compiling a Neural Net to C for a 1,744× speedup
14 points by eBPF
Very instructive; a bunch of interesting techniques in here. Incidentally, I think they rediscovered the "ResNet" idea with the passthrough gate, which is nowadays everywhere in deep learning.
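For anyone unfamiliar with the reference: the core ResNet idea is that a block outputs its input plus a learned update, so the identity "passthrough" path always exists. A minimal sketch (my own illustration, not code from the post):

```python
import numpy as np

def residual_block(x, W):
    """One residual block: identity passthrough plus a learned update.

    The transform here is an arbitrary stand-in (a single tanh layer);
    the point is the `x + ...` skip connection, which lets state and
    gradients flow through unchanged when the update is small.
    """
    update = np.tanh(W @ x)
    return x + update

# With zero weights the block is exactly the identity:
x = np.array([0.5, -1.0, 2.0])
W = np.zeros((3, 3))
out = residual_block(x, W)
```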
See also: https://sidsite.com/posts/life-transformer/
I would also like to point out a much more exciting and developed modelling process, whereby neural networks extract the underlying boolean logic from simulation outputs: https://google-research.github.io/self-organising-systems/difflogic-ca
I firmly believe that differentiable logic CA is the winner, in particular because it extracts the logic directly, and thus leads to generalizable programs instead of staying stuck in matrix-multiplication land.
This post uses the same technique as the second link, difflogic CA! Though I didn’t implement perception or async, just learned a kernel.
(The second link’s broken, but I think you meant this one: https://google-research.github.io/self-organising-systems/difflogic-ca/)