Hivemind: decentralized deep learning in PyTorch

Hivemind is a PyTorch library to train large neural networks across the Internet. Its intended usage is training a single Transformer model on hundreds of computers from different universities, companies, and volunteers.

Key Features

  • Train neural networks of arbitrary size: parts of their layers are distributed across the participants.
  • Distributed training without a master node: Distributed Hash Table allows connecting computers in a decentralized network.
  • Fault-tolerant backpropagation: forward and backward passes succeed even if some nodes are unresponsive or take too long to respond.
  • Decentralized parameter averaging: iteratively aggregate updates from multiple workers without the need to synchronize across the entire network.
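The averaging idea behind the last bullet can be illustrated with a tiny gossip simulation. This is a conceptual sketch in plain Python, not hivemind's actual API: small random groups of workers repeatedly average their values, which converges toward the global mean without any network-wide synchronization step.

```python
import random

def gossip_average(params, rounds=50, group_size=2, seed=0):
    """Simulate decentralized averaging: `params` holds one scalar per
    worker; each round, a random group replaces its values with the
    group mean. The global sum is preserved every round, so all values
    drift toward the global mean."""
    rng = random.Random(seed)
    params = list(params)
    for _ in range(rounds):
        group = rng.sample(range(len(params)), group_size)
        mean = sum(params[i] for i in group) / len(group)
        for i in group:
            params[i] = mean
    return params
```

In hivemind, the same principle is applied to model parameter tensors in groups of peers rather than scalars, so no single node ever has to talk to the entire network.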

To learn more about the ideas behind this library, see https://learning-at-home.github.io or read the NeurIPS 2020 paper.

Installation

Before installing hivemind, make sure that your environment has Python 3.8+ and PyTorch 1.6.0 or newer.
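
If you want to check the Python requirement programmatically before installing, a small version guard does the job. The `meets_requirements` helper below is purely illustrative (it is not part of hivemind):

```python
import sys

def meets_requirements(version_info, minimum=(3, 8)):
    """Return True if the given interpreter version satisfies the minimum.

    `minimum` defaults to (3, 8), hivemind's minimum supported Python.
    """
    return tuple(version_info[:2]) >= minimum

# Check the currently running interpreter:
print("Python OK:", meets_requirements(sys.version_info))
```

The PyTorch version can be inspected the same way via `torch.__version__` once PyTorch is installed.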

To start using this library, you can either install it with the pip package manager or build it from source. Since the project does not yet have an established release cycle, we recommend installing hivemind from source to get the latest bugfixes and improvements.

With pip

If your versions of Python and PyTorch match the requirements, you can install hivemind from pip:

pip install hivemind

From source

To install hivemind from source, clone the repository and install it:

git clone https://github.com/learning-at-home/hivemind.git
cd hivemind
pip install .

If you would like to verify that your installation is working properly, you can install with pip install -e .[dev] instead. Then, you can run the tests with pytest tests/.

Documentation

Contributing

Hivemind is under active development, and we welcome all contributions. Everything from bug fixes and documentation improvements to entirely new features is equally appreciated.

If you want to contribute to hivemind but don't know where to start, take a look at the unresolved issues. Open a new issue or join our chat room if you want to discuss new functionality or report a possible bug. Bug fixes are always welcome, but new features should preferably be discussed with the maintainers beforehand.

If you want to start contributing to the source code of hivemind, please see the contributing guidelines first. To learn more about other ways to contribute, read our guide.

Citation

If you found hivemind useful for your experiments, you can cite the paper that inspired it:

@inproceedings{ryabinin2020crowdsourced,
 author = {Ryabinin, Max and Gusev, Anton},
 booktitle = {Advances in Neural Information Processing Systems},
 editor = {H. Larochelle and M. Ranzato and R. Hadsell and M. F. Balcan and H. Lin},
 pages = {3659--3672},
 publisher = {Curran Associates, Inc.},
 title = {Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts},
 url = {https://proceedings.neurips.cc/paper/2020/file/25ddc0f8c9d3e22e03d3076f98d83cb2-Paper.pdf},
 volume = {33},
 year = {2020}
}

The initial implementation of hivemind used for the paper is available at mryab/learning-at-home.

In the documentation, we list several related projects and acknowledgements.