
**hivemind.optim**
==================

.. raw:: html

  This module contains decentralized optimizers that wrap regular pytorch optimizers to collaboratively train a shared model. Depending on the exact type, an optimizer may average model parameters with peers, exchange gradients, or follow a more complex distributed training strategy.
  <br><br>

.. automodule:: hivemind.optim.experimental.optimizer
.. currentmodule:: hivemind.optim.experimental.optimizer
**hivemind.Optimizer**
----------------------

.. autoclass:: Optimizer
   :members: step, zero_grad, load_state_from_peers, param_groups, shutdown
   :member-order: bysource
.. currentmodule:: hivemind.optim.grad_scaler
.. autoclass:: GradScaler
   :member-order: bysource
**CollaborativeOptimizer**
--------------------------

.. raw:: html

  CollaborativeOptimizer is a legacy version of hivemind.Optimizer. **For new projects, please use hivemind.Optimizer.**
  Currently, hivemind.Optimizer supports all the features of CollaborativeOptimizer, and more.
  CollaborativeOptimizer will still be supported for a while, but will eventually be deprecated.
  <br><br>
.. automodule:: hivemind.optim.collaborative
.. currentmodule:: hivemind.optim

.. autoclass:: CollaborativeOptimizer
   :members: step
   :member-order: bysource

.. autoclass:: CollaborativeAdaptiveOptimizer
   :members:
   :member-order: bysource