gradneat

This is a JAX-based modification of the classic NEAT (NeuroEvolution of Augmenting Topologies) algorithm that uses backpropagation to optimize the weights of every network in the population in parallel.
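The sketch below illustrates the core idea only; it is not the repository's actual code. It shows how jax.vmap and jax.grad can train an entire population of networks with gradient descent at once. For simplicity it assumes every genome shares one small fixed topology (a real NEAT population has varying, padded topologies), and all names (forward, train_one, train_population) are hypothetical.

```python
# Minimal sketch, assuming fixed-topology genomes padded to the same shape.
import jax
import jax.numpy as jnp

def forward(params, x):
    """Tiny fixed-topology network standing in for one genome."""
    h = jnp.tanh(x @ params["w1"] + params["b1"])
    return jax.nn.sigmoid(h @ params["w2"] + params["b2"])

def loss_fn(params, x, y):
    preds = forward(params, x)
    return jnp.mean((preds - y) ** 2)

def train_one(params, x, y, lr=0.1, steps=100):
    """Plain gradient descent on a single network's weights."""
    def step(p, _):
        grads = jax.grad(loss_fn)(p, x, y)
        p = jax.tree_util.tree_map(lambda w, g: w - lr * g, p, grads)
        return p, None
    params, _ = jax.lax.scan(step, params, None, length=steps)
    return params

# vmap over the leading "population" axis of every parameter array,
# so all networks are trained in parallel on the same data.
train_population = jax.jit(jax.vmap(train_one, in_axes=(0, None, None)))
```

Vectorizing the inner training loop this way lets a single accelerator update every candidate network's weights simultaneously, so topology search and weight optimization can proceed together.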

Using this, we can quickly evolve minimal neural networks that have strong inductive biases for specific tasks. I showcase its performance on the XOR, Circle, and Spiral classification tasks; result figures are shown below.

Result figures: xor_combo (XOR), spiral_combo (Spiral), circle_combo (Circle).
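As a usage illustration, the hypothetical snippet below continues the sketch above (it reuses train_population and loss_fn): it trains a small population on XOR and turns each network's final loss into a fitness signal. Population size, hidden width, and initialization are illustrative assumptions, not values from the repository.

```python
# Continuing the sketch: XOR data, a stacked population, and per-network fitness.
import jax
import jax.numpy as jnp

X = jnp.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = jnp.array([[0.], [1.], [1.], [0.]])

pop_size, hidden = 16, 4
keys = jax.random.split(jax.random.PRNGKey(0), pop_size)

def init_one(key):
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (2, hidden)),
        "b1": jnp.zeros(hidden),
        "w2": jax.random.normal(k2, (hidden, 1)),
        "b2": jnp.zeros(1),
    }

# Stack per-genome parameters along a leading population axis.
pop_params = jax.vmap(init_one)(keys)

trained = train_population(pop_params, X, Y)
losses = jax.vmap(loss_fn, in_axes=(0, None, None))(trained, X, Y)
fitness = -losses  # lower post-training loss -> higher fitness
```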
