This repository is a part of the NeurIPS 2021 demonstration "Training Transformers Together".
In this demo, we train a model similar to OpenAI DALL-E: a Transformer "language model" that generates images from text descriptions. Training happens collaboratively: volunteers from all over the Internet contribute using whatever hardware is available to them. We train on LAION-400M, the world's largest openly available image-text-pair dataset, with 400 million samples. Our model is based on the dalle-pytorch implementation by Phil Wang, with a few tweaks to make it communication-efficient.
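For orientation, here is a minimal sketch of instantiating and training such a model with the upstream dalle-pytorch API. The hyperparameters below are illustrative placeholders, not the configuration used in this run:

```python
import torch
from dalle_pytorch import DiscreteVAE, DALLE

# Discrete VAE that compresses images into a grid of codebook tokens.
vae = DiscreteVAE(
    image_size=256,
    num_layers=3,
    num_tokens=8192,
    codebook_dim=512,
    hidden_dim=64,
)

# Transformer that autoregressively models text tokens followed by image tokens.
dalle = DALLE(
    dim=1024,
    vae=vae,
    num_text_tokens=10000,  # text vocabulary size (placeholder)
    text_seq_len=256,
    depth=12,
    heads=16,
    dim_head=64,
)

# One toy training step on random data.
text = torch.randint(0, 10000, (4, 256))
images = torch.randn(4, 3, 256, 256)

loss = dalle(text, images, return_loss=True)
loss.backward()
```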
See our website for details on how to join and how the training works.
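To give a rough idea of what each volunteer peer does, here is a minimal sketch of collaborative training with the hivemind library, which this demo builds on. The run id, batch sizes, and the stand-in model are placeholders, not the values from this project:

```python
import torch
import hivemind

# Tiny stand-in model; the real run trains the DALL-E model above.
model = torch.nn.Linear(512, 512)

# Start a DHT node so other peers can discover this one.
# Subsequent peers would pass initial_peers=[...] with this node's address.
dht = hivemind.DHT(start=True)
print("To join this run, use initial_peers =", dht.get_visible_maddrs())

# hivemind.Optimizer wraps a regular optimizer: peers accumulate gradients
# until target_batch_size samples are collected globally, then average
# gradients over the Internet and take one synchronized optimizer step.
opt = hivemind.Optimizer(
    dht=dht,
    run_id="dalle_demo_run",  # placeholder experiment name
    batch_size_per_step=32,   # this peer's local batch size
    target_batch_size=8192,   # global batch per collaborative step
    optimizer=torch.optim.Adam(model.parameters(), lr=1e-4),
    use_local_updates=False,
    verbose=True,
)

# Toy local training loop on random data.
for step in range(100):
    x = torch.randn(32, 512)
    loss = torch.nn.functional.mse_loss(model(x), x)
    loss.backward()
    opt.step()
    opt.zero_grad()
```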