use default prefix in readme

justheuristic, 3 years ago
commit 2e90ac30a0
1 changed file with 6 additions and 6 deletions

README.md  +6 −6

@@ -37,7 +37,7 @@ python -m cli.inference_one_block --config cli/config.json  # see other args
 First, run one or more servers like this:
 ```bash
 # minimalistic server with non-trained bloom blocks
-python -m cli.run_server --prefix bloom6b3 --converted_model_name_or_path bigscience/test-bloomd-6b3 \
+python -m cli.run_server --converted_model_name_or_path bigscience/test-bloomd-6b3 \
   --block_indices 3:5 --torch_dtype float32 --identity_path ./server1.id --host_maddrs /ip4/127.0.0.1/tcp/31337
 # when running multiple servers:
 # - give each server a unique --identity_path (or remote --identity_path arg when debugging)
@@ -58,7 +58,7 @@ dht = hivemind.DHT(
     client_mode=True, start=True,
 )

-layer3, layer4 = get_remote_module(dht, ['bloom6b3.3', 'bloom6b3.4'])
+layer3, layer4 = get_remote_module(dht, ['bigscience/test-bloomd-6b3.3', 'bigscience/test-bloomd-6b3.4'])
 assert layer3 is not None and layer4 is not None, "one or both layers were not found in DHT"
 # test forward/backward, two blocks
 outputs, = layer4(*layer3(torch.randn(1, 64, 4096)))
@@ -88,14 +88,14 @@ python -m cli.convert_model --model bigscience/bloom-6b3  \
 To test distributed inference, run one or more servers, then open a new shell and run pytest with environment variables:
 ```bash
 # shell A: serve blocks 3 and 4
-python -m cli.run_server --prefix bloom6b3 --converted_model_name_or_path bigscience/test-bloomd-6b3 \
+python -m cli.run_server --converted_model_name_or_path bigscience/test-bloomd-6b3 \
   --block_indices 3:5 --torch_dtype float32 --identity_path ./server1.id --host_maddrs /ip4/127.0.0.1/tcp/31337

 # shell B: connect to the swarm and test individual blocks for exact match
 export PYTHONPATH=. INITIAL_PEERS="/ip4/TODO_COPY_INITIAL_PEERS_FROM_SERVER_OUTPUT"
-BLOCK_UID=bloom6b3.3 pytest tests/test_block_exact_match.py
-BLOCK_UID=bloom6b3.4 pytest tests/test_block_exact_match.py
+BLOCK_UID=bigscience/test-bloomd-6b3.3 pytest tests/test_block_exact_match.py
+BLOCK_UID=bigscience/test-bloomd-6b3.4 pytest tests/test_block_exact_match.py

 # the test below will fail because there is no server that serves layer 7
-# BLOCK_UID=bloom6b3.7 pytest tests/test_block_exact_match.py
+# BLOCK_UID=bigscience/test-bloomd-6b3.7 pytest tests/test_block_exact_match.py
 ```
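
For context, after this change the client snippet touched by the second hunk reads roughly as follows when assembled into one self-contained piece. This is a sketch built from the diff's own context lines; the import path for `get_remote_module` and the `initial_peers` placeholder are assumptions, not something shown in this commit.

```python
import torch
import hivemind

# Assumption: adjust this import to wherever get_remote_module lives in the repo.
from src import get_remote_module

# Connect to the swarm as a lightweight client.
# Assumption: replace the placeholder with the multiaddr printed by a running server.
dht = hivemind.DHT(
    initial_peers=["/ip4/TODO_COPY_INITIAL_PEERS_FROM_SERVER_OUTPUT"],
    client_mode=True, start=True,
)

# With --prefix omitted, block UIDs default to "<converted_model_name_or_path>.<block_index>".
layer3, layer4 = get_remote_module(dht, ['bigscience/test-bloomd-6b3.3', 'bigscience/test-bloomd-6b3.4'])
assert layer3 is not None and layer4 is not None, "one or both layers were not found in DHT"

# Forward pass through the two remote blocks (same tensor shape as in the README).
outputs, = layer4(*layer3(torch.randn(1, 64, 4096)))
```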