In the spirit of "just making shit"—because, frankly, you can these days—I decided to get back to basics. I built a lightweight RL MLP powered by WebGPU that runs directly in your browser.
The twist? I replaced the standard MLP with a continued-fraction network. The real win is interpretability: by taking a Taylor expansion of the continued fraction, you can decompose the output and see which features actually drove the decision. It swaps the usual "it's just black-box magic" for real visibility into the logic.
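To make that concrete, here is a minimal NumPy sketch of the idea (my own toy construction, not the exact architecture from the repo or the CoFrGeNet paper): a small continued-fraction "layer" whose partial numerators and denominators are functions of the input features, plus a first-order Taylor expansion whose gradient terms serve as per-feature attributions. The squared-affine terms are purely a demo trick to keep every denominator positive and the fraction free of poles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy depth-3 continued-fraction "layer" (a sketch, not the repo's exact form):
#   f(x) = d0(x) + a1(x) / (d1(x) + a2(x) / (d2(x) + a3(x) / d3(x)))
# Numerators a_i and denominators d_i are squared-affine in the features,
# which keeps every level of the fraction >= 1 and avoids poles in this demo.
D, DEPTH = 4, 3
Wa = 0.5 * rng.normal(size=(DEPTH, D))
Wd = 0.5 * rng.normal(size=(DEPTH + 1, D))

def cf_forward(x):
    a = (Wa @ x) ** 2               # non-negative partial numerators a_1..a_DEPTH
    d = 1.0 + (Wd @ x) ** 2         # positive partial denominators d_0..d_DEPTH
    out = d[DEPTH]
    for i in range(DEPTH, 0, -1):   # fold the fraction from the innermost term out
        out = d[i - 1] + a[i - 1] / out
    return out

# First-order Taylor expansion around x0: f(x) ~= f(x0) + grad . (x - x0).
# Each component of grad is that feature's local contribution to the output.
def feature_attributions(x0, eps=1e-5):
    grad = np.empty(D)
    for j in range(D):              # central finite differences per feature
        e = np.zeros(D)
        e[j] = eps
        grad[j] = (cf_forward(x0 + e) - cf_forward(x0 - e)) / (2 * eps)
    return grad

x0 = rng.normal(size=D)
dx = 0.01 * rng.normal(size=D)      # small perturbation to check the expansion
grad = feature_attributions(x0)
taylor = cf_forward(x0) + grad @ dx
err = abs(cf_forward(x0 + dx) - taylor)
print("per-feature attributions:", grad)
print("linearization error:", err)  # small error => the linear decomposition is locally faithful
```

Because the expansion is linear in the features, each `grad[j] * dx[j]` term reads directly as "how much feature j moved the output", which is the visibility the post is talking about.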
It was a fun experiment. Feel free to clone the repo and make it your own!
cahlen/neuron-runner
https://github.com/cahlen/neuron-runner
CoFrGeNet: Continued Fraction Architectures for Language Generation (2601.21766)