DISCO is a code-free and installation-free browser platform that allows any non-technical user to collaboratively train machine learning models without sharing any private data.

epfml/disco

DISCO - DIStributed COllaborative Machine Learning

DISCO leverages federated 🌟 and decentralized ✨ learning to allow several data owners to collaboratively build machine learning models without sharing any original data.

The latest version is always running at the following link, directly in your browser, on web and mobile:

🕺 https://discolab.ai/ 🕺


🪄 DEVELOPERS: DISCO is written entirely in JavaScript/TypeScript. Have a look at our developer guide.


❓ WHY DISCO?

  • To build deep learning models across private datasets without compromising data privacy, ownership, sovereignty, or model performance
  • To create an easy-to-use platform that allows non-specialists to participate in collaborative learning

βš™οΈ HOW DISCO WORKS

  • DISCO has a public model – private data approach
  • Private and secure model updates – not data – are communicated to either:
    • a central server : federated learning ( 🌟 )
    • directly between users : decentralized learning ( ✨ ) i.e. no central coordination
  • Model updates are then securely aggregated into a trained model
  • See more HERE
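The aggregation step above can be sketched in plain TypeScript. This is a minimal illustration of federated averaging under the public model – private data approach, not DISCO's actual implementation; the type and function names here are hypothetical:

```typescript
// Minimal sketch of federated averaging, for illustration only.
// Each client shares a model update (a weight vector), never its raw data;
// the server combines the updates into a new global model, weighting each
// client by how much private data it trained on.

type WeightVector = number[];

// Hypothetical client update: local weights after training on private data.
interface ClientUpdate {
  weights: WeightVector;
  numSamples: number; // used to weight the average by dataset size
}

function aggregate(updates: ClientUpdate[]): WeightVector {
  const totalSamples = updates.reduce((sum, u) => sum + u.numSamples, 0);
  const dim = updates[0].weights.length;
  const global: WeightVector = new Array(dim).fill(0);
  for (const u of updates) {
    const coeff = u.numSamples / totalSamples;
    for (let i = 0; i < dim; i++) {
      global[i] += coeff * u.weights[i];
    }
  }
  return global;
}

// Two clients with different amounts of private data:
const merged = aggregate([
  { weights: [1, 1], numSamples: 10 },
  { weights: [4, 1], numSamples: 30 },
]);
console.log(merged); // [3.25, 1]
```

The server only ever sees weight vectors, never the underlying training examples; in the decentralized scheme the same averaging happens between peers instead of on a server.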

❓ DISCO TECHNOLOGY

  • DISCO runs arbitrary deep learning tasks and model architectures in your browser, via TensorFlow.js
  • Decentralized learning ✨ relies on peer-to-peer communication
  • Have a look at how DISCO ensures privacy and confidentiality HERE
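To illustrate how model updates can stay private even during aggregation, here is a sketch of pairwise additive masking, a standard secure-aggregation idea. This is illustrative only, not DISCO's actual protocol, and the fixed masks stand in for values that would in practice come from pairwise key agreement between peers:

```typescript
// Illustrative secure aggregation with pairwise additive masks.
// Each pair of clients (i, j) with i < j shares a random mask: client i adds
// it to its update, client j subtracts it. Any single masked update looks
// random, but the masks cancel in the sum, so the aggregator recovers only
// the total – never an individual client's update.

type Vector = number[];

const add = (a: Vector, b: Vector): Vector => a.map((x, i) => x + b[i]);
const sub = (a: Vector, b: Vector): Vector => a.map((x, i) => x - b[i]);

function maskAll(
  updates: Vector[],
  sharedMask: (i: number, j: number) => Vector,
): Vector[] {
  return updates.map((u, i) => {
    let masked = u;
    for (let j = 0; j < updates.length; j++) {
      if (j === i) continue;
      const m = sharedMask(Math.min(i, j), Math.max(i, j));
      masked = i < j ? add(masked, m) : sub(masked, m);
    }
    return masked;
  });
}

function serverSum(masked: Vector[]): Vector {
  return masked.reduce((acc, v) => add(acc, v));
}

// Demo: 3 clients, 2-dimensional updates, fixed masks for reproducibility.
const updates: Vector[] = [[1, 2], [3, 4], [5, 6]];
const masks: Record<string, Vector> = { "0,1": [10, -7], "0,2": [3, 3], "1,2": [-5, 8] };
const shared = (i: number, j: number): Vector => masks[`${i},${j}`];

const maskedUpdates = maskAll(updates, shared);
// Individual masked vectors reveal little, but the sum equals the true total:
console.log(serverSum(maskedUpdates)); // [9, 12]
```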

🧪 RESEARCH-BASED DESIGN

DISCO leverages the latest research advances, enabling open-access and easy-to-use distributed training which is

  • 🔒 privacy-preserving (R1)
  • 🛠️ dynamic and asynchronous over time (R2, R7)
  • 🥷 robust to malicious actors (R3, partially)

And more on the roadmap:

  • 🌪️ efficient (R4, R5)
  • 🔒 privacy-preserving while Byzantine-robust (R6)
  • 🥷 resistant to data poisoning (R8)
  • 🍎 🍌 interpretable in imperfectly interoperable data distributions (R9)
  • 🪞 personalizable (R10)
  • 🥕 fairly incentivizing participation

🏁 HOW TO USE DISCO

  • Start by exploring our example tasks on the DISCOllaboratives page.
  • The example DISCOllaboratives are based on popular ML tasks such as GPT-2, Titanic, MNIST, or CIFAR-10
  • You can also create your own DISCOllaboratives without coding on the custom training page:
    • Upload the initial model
    • Choose between federated and decentralized training schemes... connect your data and... done! 📊
    • For more details on ML tasks and custom training, have a look at this guide