
Add links to "Example Use Cases" (#497)

I think some people are interested in the "Example Use Cases" section because they'd like to know what has already been built with hivemind, while others would like to take a look at the code if they've already started using hivemind and want some code examples.

Currently, the sahajBERT link leads to the sahajBERT repo, which doesn't describe much about the project itself. Conversely, it's hard to find the repos with the code when following the CALM and "Training Transformers Together" links.

This PR adds more useful links to each of the projects.
Alexander Borzunov 3 years ago
parent
commit
7a7c93aeff
1 changed file with 5 additions and 5 deletions
  1. README.md (+5 −5)

+ 5 - 5
README.md

@@ -26,16 +26,16 @@ large model on hundreds of computers from different universities, companies, and
 To learn more about the ideas behind this library,
 see the [full list](https://github.com/learning-at-home/hivemind/tree/refer-to-discord-in-docs#citation) of our papers below.
 
-## Example Applications and Use Cases
+## Example Use Cases
 
 This section lists projects that leverage hivemind for decentralized training. 
 If you have successfully trained a model or created a downstream repository with the help of our library, 
 feel free to submit a pull request that adds your project to this list.
 
-* [sahajBERT](https://github.com/tanmoyio/sahajbert) — a collaboratively pretrained ALBERT-xlarge for the Bengali language.
-* [CALM](https://github.com/NCAI-Research/CALM) (Collaborative Arabic Language Model) — a masked language model trained on a combination of Arabic datasets.
-* [Training Transformers Together](https://training-transformers-together.github.io/) — a NeurIPS 2021 demonstration that trained a collaborative text-to-image Transformer model.
-* [HivemindStrategy](https://pytorch-lightning.readthedocs.io/en/latest/api/pytorch_lightning.strategies.HivemindStrategy.html) in PyTorch Lightning allows adapting your existing pipelines to training over a slow network with unreliable peers.
+* **sahajBERT** ([blog post](https://huggingface.co/blog/collaborative-training), [code](https://github.com/tanmoyio/sahajbert)) — a collaboratively pretrained ALBERT-xlarge for the Bengali language.
+* **CALM** ([webpage](https://huggingface.co/CALM), [code](https://github.com/NCAI-Research/CALM)) — a masked language model trained on a combination of Arabic datasets.
+* **Training Transformers Together** ([webpage](https://training-transformers-together.github.io/), [code](https://github.com/learning-at-home/dalle-hivemind)) — a NeurIPS 2021 demonstration that trained a collaborative text-to-image Transformer model.
+* **HivemindStrategy** ([docs](https://pytorch-lightning.readthedocs.io/en/latest/api/pytorch_lightning.strategies.HivemindStrategy.html)) in PyTorch Lightning allows adapting your existing pipelines to training over a slow network with unreliable peers.
 
 ## Installation
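
For readers coming to this list from the PyTorch Lightning side, here is a minimal sketch of what the HivemindStrategy docs linked in the last bullet describe. The `target_batch_size` value and the module/dataloader names are illustrative placeholders, not something introduced by this PR:

```python
# Minimal sketch (assumptions noted below): plug HivemindStrategy into an existing
# PyTorch Lightning pipeline so peers on a slow, unreliable network can train together.
import pytorch_lightning as pl
from pytorch_lightning.strategies import HivemindStrategy

trainer = pl.Trainer(
    # Peers keep training locally and synchronize once the collaborative batch
    # reaches this size; 8192 is an illustrative value, not a recommendation.
    strategy=HivemindStrategy(target_batch_size=8192),
    accelerator="gpu",
    devices=1,
)

# `MyLightningModule` and `my_dataloader` are hypothetical stand-ins for an existing pipeline.
trainer.fit(MyLightningModule(), train_dataloaders=my_dataloader)
```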