
Add link to MoE tutorial to readme (#329)

Previously, users interested in DMoE were unable to find any instructions in the repository.
Alexander Borzunov · 4 years ago
commit 407c201941
1 changed file with 2 additions and 0 deletions: README.md

+ 2 - 0
README.md

@@ -78,6 +78,8 @@ of [Go toolchain](https://golang.org/doc/install) (1.15 or higher).
   installation and training a simple neural network with several peers.
 * [examples/albert](https://github.com/learning-at-home/hivemind/tree/master/examples/albert) contains the starter kit
   and instructions for training a Transformer masked language model collaboratively.
+* The [Mixture-of-Experts tutorial](https://learning-at-home.readthedocs.io/en/latest/user/moe.html)
+  covers the usage of Decentralized Mixture-of-Experts layers.
 * API reference and additional tutorials are available
   at [learning-at-home.readthedocs.io](https://learning-at-home.readthedocs.io)
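The linked tutorial covers hivemind's Decentralized Mixture-of-Experts layers. As a generic illustration of the top-k gating idea behind any MoE layer (not hivemind's actual API, which routes expert calls over a DHT across peers), here is a minimal in-process sketch; the function names `top_k_gating` and `moe_forward` are hypothetical, chosen for this example only:

```python
import numpy as np

def top_k_gating(x, gate_weights, k=2):
    """Softmax over the k highest expert scores; all other experts get weight 0.
    Illustrative only -- a decentralized MoE would route the top-k requests
    to remote experts instead of calling them locally."""
    scores = x @ gate_weights                      # (num_experts,)
    top = np.argsort(scores)[-k:]                  # indices of the k best experts
    exp = np.exp(scores[top] - scores[top].max())  # numerically stable softmax
    weights = np.zeros_like(scores)
    weights[top] = exp / exp.sum()
    return weights

def moe_forward(x, gate_weights, experts, k=2):
    """Weighted sum of the selected experts' outputs."""
    w = top_k_gating(x, gate_weights, k)
    return sum(w[i] * experts[i](x) for i in np.nonzero(w)[0])

rng = np.random.default_rng(0)
d, num_experts = 8, 4
gate = rng.normal(size=(d, num_experts))
# Each "expert" is just a fixed random linear map for demonstration.
experts = [lambda x, W=rng.normal(size=(d, d)): x @ W for _ in range(num_experts)]
x = rng.normal(size=d)
y = moe_forward(x, gate, experts)
print(y.shape)  # (8,)
```

Only the k selected experts contribute to the output, which is what makes MoE layers cheap to scale: adding experts grows capacity without growing per-token compute.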