A simple PyTorch implementation of Population Based Training of Neural Networks.

besterma/PopulationBasedTraining


PBT: Population Based Training

Population Based Training of Neural Networks, Jaderberg et al. @ DeepMind

A simple PyTorch implementation of PBT.

What this code is for

Finding a good hyperparameter schedule.

How does PBT work?

PBT trains each model in the population for a short interval and evaluates it on the validation set. The parameters and hyperparameters of the top-performing models are then copied to the bottom-performing models (exploitation), and the copied hyperparameters are randomly perturbed (exploration). Each model is then trained further, and the cycle repeats. This lets PBT discover a hyperparameter schedule rather than a single fixed configuration. PBT also admits different selection methods, i.e. different ways of defining the "top" and "bottom" sets (top 5, top 5%, etc.).
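The exploit/explore step above can be sketched as follows. This is a minimal illustration, not this repo's actual API: the dictionary layout (`score`, `weights`, `hyperparams`), the `cutoff` fraction, and the perturbation factors of 0.8/1.2 are all assumptions chosen for clarity (the paper uses similar multiplicative perturbations).

```python
import copy
import random

def exploit_and_explore(population, cutoff=0.2):
    """One PBT step: copy weights and hyperparameters from top performers
    to bottom performers (exploit), then perturb the copies (explore).

    `population` is a list of dicts with hypothetical keys
    "score", "weights", and "hyperparams" -- illustrative only.
    """
    # Rank members by validation score, best first.
    ranked = sorted(population, key=lambda m: m["score"], reverse=True)
    n_cut = max(1, int(len(ranked) * cutoff))
    tops, bottoms = ranked[:n_cut], ranked[-n_cut:]

    for loser in bottoms:
        winner = random.choice(tops)
        # Exploit: inherit the winner's parameters and hyperparameters.
        loser["weights"] = copy.deepcopy(winner["weights"])
        loser["hyperparams"] = dict(winner["hyperparams"])
        # Explore: perturb each hyperparameter by a random factor.
        for key in loser["hyperparams"]:
            loser["hyperparams"][key] *= random.choice([0.8, 1.2])
    return population
```

Between calls to this step, each member would continue ordinary training with its current hyperparameters; the multiplicative perturbation keeps values like learning rates positive and on a sensible scale.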

For more information, see the paper or blog post.

Requirements

  • PyTorch >= 1.0.0

Usage

$ python main.py --device cuda --population_size 10 --batch_size 20
