
Use arxiv link for the paper, add "Privacy and security" discussion

Alexander Borzunov, 3 years ago
Commit ac8d260ca9
1 file changed, 8 insertions and 2 deletions

README.md: +8 −2

@@ -17,7 +17,7 @@
 - Beyond traditional language model APIs — you can employ any fine-tuning and sampling methods by executing custom paths through the model or accessing its hidden states. This allows for the comforts of an API with the flexibility of PyTorch.
 
 <p align="center">
-    <b><a href="https://petals.ml/petals.pdf">[Read paper]</a></b> | <b><a href="https://petals.ml/">[View website]</a></b>
+    <b><a href="https://arxiv.org/pdf/2209.01188.pdf">[Read paper]</a></b> | <b><a href="https://petals.ml/">[View website]</a></b>
 </p>
 
 ## How does it work?
@@ -28,10 +28,16 @@
 
 ### 🚧 This project is in active development
 
-Be careful: some features may not work, interfaces may change, and we have no detailed docs yet (see [roadmap](https://github.com/bigscience-workshop/petals/issues/12)).
+Please be careful: some features may not work, interfaces may change, and we have no detailed docs yet (see [roadmap](https://github.com/bigscience-workshop/petals/issues/12)).
 
 A stable version of the code and a public swarm open to everyone will be released in November 2022. You can [subscribe](https://petals.ml/) to be emailed when it happens or fill in [this form](https://forms.gle/TV3wtRPeHewjZ1vH9) to help the public launch by donating GPU time. In the meantime, you can launch and use your own private swarm.
 
+### 🔒 Privacy and security
+
+If you work with sensitive data, you should only use a private swarm (or a subset of servers in the public swarm) hosted by people and institutions you trust, who are authorized to process this data.
+
+This is important because peers serving the first layers of the model receive activations from which they can recover your original input data (likewise, peers serving the last layers can recover the model outputs). In addition, malicious peers may alter their outputs to influence the results you get from the model. See a more detailed discussion in Section 4 of our [paper](https://arxiv.org/pdf/2209.01188.pdf).
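
For illustration, here is a minimal, hypothetical sketch of connecting a client only to a private swarm of trusted servers. It assumes a `DistributedBloomForCausalLM` client class and an `initial_peers` argument for bootstrap addresses; the import path, model name, and peer multiaddresses below are placeholders, so check the current codebase for the exact API.

```python
from transformers import BloomTokenizerFast

# Hypothetical import path; the client class may live elsewhere in this repo.
from petals import DistributedBloomForCausalLM

# Multiaddresses of bootstrap peers run by people/institutions you trust
# (placeholders; use the addresses printed by your own private-swarm servers).
TRUSTED_PEERS = ["/ip4/10.1.2.3/tcp/31337/p2p/QmYourTrustedPeerID"]

tokenizer = BloomTokenizerFast.from_pretrained("bigscience/bloom")  # placeholder model name
model = DistributedBloomForCausalLM.from_pretrained(
    "bigscience/bloom",           # placeholder model name
    initial_peers=TRUSTED_PEERS,  # connect only to the private swarm, not the public one
)

# Sensitive inputs now pass only through the servers you listed above.
inputs = tokenizer("Confidential report:", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=3)
print(tokenizer.decode(outputs[0]))
```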
+
 ## Code examples
 
 Solving a sequence classification task via soft prompt tuning of BLOOM-176B: