# gradneat

This is a JAX-based modification of the classic NEAT algorithm that uses backpropagation to optimize the weights of each network in the population in parallel.

This makes it possible to quickly evolve minimal neural networks with strong inductive biases for specific tasks. Performance is showcased on classification tasks including XOR, Circle, and Spiral.
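The core idea of gradient-based inner-loop training over a population can be sketched as below. This is an illustrative example only, not the repo's actual API: it trains a population of fixed-topology networks on XOR with `jax.vmap`-parallelized gradient descent (all hyperparameters here are assumptions), and it omits the topology mutation and speciation that NEAT performs around this inner loop.

```python
import jax
import jax.numpy as jnp

# XOR dataset.
X = jnp.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = jnp.array([0., 1., 1., 0.])

def init_params(key):
    # Small fixed-topology MLP: 2 -> 4 -> 1 (a stand-in for one NEAT genome).
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (2, 4)),
        "b1": jnp.zeros(4),
        "w2": jax.random.normal(k2, (4, 1)),
        "b2": jnp.zeros(1),
    }

def forward(params, x):
    h = jnp.tanh(x @ params["w1"] + params["b1"])
    return jax.nn.sigmoid(h @ params["w2"] + params["b2"]).squeeze(-1)

def loss(params):
    return jnp.mean((forward(params, X) - y) ** 2)

def step(params):
    # One full-batch gradient-descent step via backprop.
    grads = jax.grad(loss)(params)
    return jax.tree_util.tree_map(lambda p, g: p - 0.5 * g, params, grads)

# A population of 16 networks, initialized and trained in parallel with vmap.
keys = jax.random.split(jax.random.PRNGKey(0), 16)
population = jax.vmap(init_params)(keys)
pop_step = jax.jit(jax.vmap(step))

for _ in range(1000):
    population = pop_step(population)

pop_losses = jax.vmap(loss)(population)
print("best XOR loss in population:", float(pop_losses.min()))
```

Because `vmap` maps the same update over stacked parameter arrays, the whole population trains in one batched computation on an accelerator; NEAT's outer loop would then select and mutate genomes based on the resulting fitnesses.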

(Result figures: `xor_combo`, `spiral_combo`, `circle_combo` — combined plots for the XOR, Spiral, and Circle tasks.)