
Add Colab-related changes (#80)

Add changes to make working in Colab more comfortable.
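
The core of this commit is the Colab-detection idiom used in the added notebook cell. A minimal sketch of that pattern, assuming a local (non-Colab) Python environment (the repo URL mirrors the one in the diff below; everything else is illustrative):

```python
import sys

# Colab injects the `google.colab` module at startup, so checking
# sys.modules is a reliable way to tell the two environments apart
# without importing anything.
IN_COLAB = 'google.colab' in sys.modules

if IN_COLAB:
    # In Colab only: fetch the repo and install its dependencies,
    # mirroring the commands in the added notebook cell.
    import subprocess
    subprocess.run(['git', 'clone', 'https://github.com/bigscience-workshop/petals'])
    subprocess.run(['pip', 'install', '-r', 'petals/requirements.txt'])
```

Run locally, `IN_COLAB` is `False` and the setup commands are skipped, so the cell is a no-op outside Colab.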

Co-authored-by: Alexander Borzunov <hxrussia@gmail.com>
Artem Chumachenko 2 years ago
parent
commit
2cb82dd648
1 changed file with 36 additions and 4 deletions

+ 36 - 4
examples/prompt-tuning-personachat.ipynb

@@ -13,7 +13,9 @@
     "\n",
     "In this example, we show how to use [prompt tuning](https://aclanthology.org/2021.emnlp-main.243.pdf) to adapt a test 6B version of the [BLOOM](https://huggingface.co/bigscience/bloom) model for a specific downstream task. We will run this model in a decentralized fashion using [Petals](https://github.com/bigscience-workshop/petals). Petals servers will maintain the BLOOM blocks (they are kept unchanged during adaptation), and the gradient descent will learn a few prefix tokens stored on a Petals client.\n",
     "\n",
-    "We will adapt the BLOOM model for the chatbot task using the [Personachat](https://huggingface.co/datasets/bavard/personachat_truecased) dataset. For a given dialogue context, the model has to provide a relevant answer."
+    "We will adapt the BLOOM model for the chatbot task using the [Personachat](https://huggingface.co/datasets/bavard/personachat_truecased) dataset. For a given dialogue context, the model has to provide a relevant answer.\n",
+    "\n",
+    "To open this notebook in Colab: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/bigscience-workshop/petals/blob/main/examples/prompt-tuning-personachat.ipynb)"
    ]
   },
   {
@@ -24,6 +26,31 @@
     "First, we have to prepare all dependencies."
    ]
   },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "73bbc648",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# This block is only needed for Colab users. It changes nothing if you are running this notebook locally.\n",
+    "import subprocess\n",
+    "import sys\n",
+    "\n",
+    "\n",
+    "IN_COLAB = 'google.colab' in sys.modules\n",
+    "\n",
+    "if IN_COLAB:\n",
+    "    subprocess.run(['git', 'clone', 'https://github.com/bigscience-workshop/petals'])\n",
+    "    subprocess.run(['pip', 'install', '-r', 'petals/requirements.txt'])\n",
+    "    subprocess.run(['pip', 'install', 'datasets', 'lib64'])\n",
+    "\n",
+    "    try:\n",
+    "        subprocess.check_output([\"nvidia-smi\", \"-L\"])\n",
+    "    except subprocess.CalledProcessError as e:\n",
+    "        subprocess.run(['rm', '-r', '/usr/local/cuda/lib64'])"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
@@ -33,7 +60,7 @@
    "source": [
     "import os\n",
     "import sys\n",
-    "sys.path.insert(0, \"..\")\n",
+    "sys.path.insert(0, \"..\")  # for Colab, change to sys.path.insert(0, './petals/')\n",
     " \n",
     "import torch\n",
     "import transformers\n",
@@ -285,7 +312,7 @@
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": "Python 3 (ipykernel)",
+   "display_name": "Python 3.8.10 64-bit",
    "language": "python",
    "name": "python3"
   },
@@ -299,7 +326,12 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.8.12"
+   "version": "3.8.9"
+  },
+  "vscode": {
+   "interpreter": {
+    "hash": "31f2aee4e71d21fbe5cf8b01ff0e069b9275f58929596ceb00d14d90e3e16cd6"
+   }
   }
  },
  "nbformat": 4,