DeeperSpeed

DeeperSpeed is a fork of Microsoft's DeepSpeed library that is tailor-made for GPT-NeoX by EleutherAI.

Prior to 3/9/2023, DeeperSpeed was based on an old version of DeepSpeed (0.3.15). In order to migrate to the latest upstream DeepSpeed version while allowing users to access the old versions of GPT-NeoX and DeeperSpeed, we have introduced two versioned releases for both libraries:

  • Version 1.0 of GPT-NeoX and DeeperSpeed maintain snapshots of the old stable versions that GPT-NeoX-20B and the Pythia Suite were trained on.
  • Version 2.0 of GPT-NeoX and DeeperSpeed are the latest versions, built on the latest upstream DeepSpeed, and will be maintained going forward.
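
A pinned release can be installed directly from a Git tag with pip. This is a minimal sketch: the repository URL and the tag names v1.0 and v2.0 below are assumptions about how the versioned releases are published, so check the repository's releases page for the actual identifiers.

    # DeepSpeed-based packages expect PyTorch to already be installed.
    pip install git+https://github.com/EleutherAI/DeeperSpeed.git@v1.0   # snapshot line (pre-3/9/2023)
    pip install git+https://github.com/EleutherAI/DeeperSpeed.git@v2.0   # current line, tracking upstream DeepSpeed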

