Commit History

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Alexander Borzunov | 26ebbfe8f0 | Support macOS (#477) | 2 years ago |
| Alexander Borzunov | 915b357740 | Require transformers>=4.32.0 (#479) | 2 years ago |
| Alexander Borzunov | 18e93afc73 | Don't install cpufeature on non-x86_64 machines (#478) | 2 years ago |
| Artem Chumachenko | a14ae7334d | Update peft to 0.5.0 version (#475) | 2 years ago |
| justheuristic | 4f850996bb | Change transformers version assert (#472) | 2 years ago |
| justheuristic | 9250025140 | Support transformers 4.32.x (#471) | 2 years ago |
| justheuristic | adda5f8c20 | Temporarily require peft<0.5.0, transformers<4.32.0 (#470) | 2 years ago |
| Alexander Borzunov | 593d980ad8 | Use bitsandbytes 0.41.1 (#442) | 2 years ago |
| Alexander Borzunov | f3fafd14a4 | Bump version to 2.0.1 (#411) | 2 years ago |
| Alexander Borzunov | eb0664b993 | Support Python 3.11 (#393) | 2 years ago |
| Alexander Borzunov | e9a20e7e53 | Require accelerate>=0.20.3 as transformers do (#383) | 2 years ago |
| Alexander Borzunov | 895327a0ae | Fix readme code example, require Python < 3.11 until supported (#374) | 2 years ago |
| Alexander Borzunov | c735dd7ba3 | Update transformers to 4.31.0 and peft to 0.4.0 (#371) | 2 years ago |
| Alexander Borzunov | f97582fb5f | Require transformers < 4.31.0 until we're compatible (#369) | 2 years ago |
| Alexander Borzunov | 62d9ed5ce7 | Implement shortest-path routing for inference (#362) | 2 years ago |
| Alexander Borzunov | 3f733a96e3 | Use bitsandbytes 0.40.1.post1 (#357) | 2 years ago |
| Alexander Borzunov | 2c8959e713 | Share more info about a server in DHT (#355) | 2 years ago |
| Alexander Borzunov | 1a78638c02 | Test that bitsandbytes is not imported when it's not used (#351) | 2 years ago |
| Artem Chumachenko | b9f0a5467f | Support peft LoRA adapters (#335) | 2 years ago |
| Alexander Borzunov | dfc6578c8e | Use bitsandbytes 0.40.0.post4 with bias hotfix (#342) | 2 years ago |
| Alexander Borzunov | fa095f6461 | Use 4-bit for llama by default, use bitsandbytes 0.40.0.post3 (#340) | 2 years ago |
| Alexander Borzunov | de930918a0 | Support loading blocks in 4-bit (QLoRA NF4 format, disabled by default) (#333) | 2 years ago |
| Alexander Borzunov | 66a47c763e | Require pydantic < 2.0 (2.0 is incompatible with hivemind 1.1.8) (#337) | 2 years ago |
| Alexander Borzunov | cb3f018f9f | Add LLaMA support (#323) | 2 years ago |
| Alexander Borzunov | 0a313bf6c5 | Update hivemind to 1.1.8, enable efficient bfloat16 encoding (#311) | 2 years ago |
| Alexander Borzunov | 454c193863 | Fix OOMs happening in case of accelerate >= 0.16.0 (#310) | 2 years ago |
| Alexander Borzunov | 98be9ffe4c | Relax the rest of Hugging Face dependencies (#305) | 2 years ago |
| Alexander Borzunov | 35662b4a16 | Require bitsandbytes == 0.38.0.post2, hivemind == 1.1.7 (#302) | 2 years ago |
| Alexander Borzunov | 2116df08bc | Fix deps, enable 8-bit by default for TP (#298) | 2 years ago |
| justheuristic | 987f4d2b2f | Update bitsandbytes, hivemind, transformers (#290) | 2 years ago |