Update hivemind/client/averaging/training.py

Co-authored-by: Max Ryabinin <mryabinin0@gmail.com>
justheuristic 4 years ago
parent
commit
8a61b9f862
1 changed file with 1 addition and 1 deletion

+ 1 - 1
hivemind/client/averaging/training.py

@@ -25,7 +25,7 @@ class TrainingAverager(DecentralizedAverager):
    :param average_parameters: whether or not to average model parameters in self.step(...)
    :param average_gradients: whether or not to average model gradients in self.step(...)
    :param average_opt_statistics: if specified, average optimizer statistics with corresponding names in statedict
-    :param scheduler: if specified, averager keeps scheduler state
+    :param scheduler: if specified, averager stores scheduler state
    :param initialize_optimizer: if True, this will run a speculative optimizer step with
      zero gradients to initialize all tensors. If False, please initialize the optimizer state manually.
    :param extra_tensors: if specified, these extra tensors will also be averaged and shared in load_state_from_peers.