mirror of https://github.com/oobabooga/text-generation-webui.git
synced 2024-11-21 23:57:58 +01:00
Remove RWKV from requirements.txt
This commit is contained in:
parent 7aed53559a
commit 1490c0af68
@@ -18,6 +18,14 @@ There is a bug in transformers==4.29.2 that prevents RWKV from being loaded in 8-bit mode.
The instructions below are from before RWKV was supported in transformers, and they are kept for legacy purposes. The old implementation is possibly faster, but it lacks the full range of samplers that the transformers library offers.
#### 0. Install the RWKV library
```
pip install rwkv
```
`0.7.3` was the last version that I tested. If you experience any issues, try `pip install rwkv==0.7.3`.
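For reference, a minimal sketch of how the standalone `rwkv` package is typically driven, assuming `rwkv==0.7.3`, a locally downloaded `.pth` checkpoint, and a local copy of the `20B_tokenizer.json` tokenizer file. The paths and sampling settings below are placeholder assumptions, not the web UI's actual loader code:

```python
# Sketch: load an RWKV checkpoint with the standalone rwkv package
# (the pre-transformers code path). Paths below are placeholders.
from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# strategy selects device and precision, e.g. 'cpu fp32' or 'cuda fp16'
model = RWKV(model='models/RWKV-4-Pile-169M-20220807-8023.pth', strategy='cpu fp32')
pipeline = PIPELINE(model, '20B_tokenizer.json')  # tokenizer file from the ChatRWKV repo

args = PIPELINE_ARGS(temperature=1.0, top_p=0.85)
print(pipeline.generate('The quick brown fox', token_count=50, args=args))
```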
#### 1. Download the model
It is available in different sizes:
@@ -10,12 +10,11 @@ pandas
Pillow>=9.5.0
pyyaml
requests
rwkv==0.7.3
safetensors==0.3.1
sentencepiece
transformers==4.29.2
tqdm
git+https://github.com/huggingface/peft@4fd374e80d670781c0d82c96ce94d1215ff23306
transformers==4.29.2
bitsandbytes==0.38.1; platform_system != "Windows"
llama-cpp-python==0.1.51; platform_system != "Windows"
https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.51/llama_cpp_python-0.1.51-cp310-cp310-win_amd64.whl; platform_system == "Windows"
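The `; platform_system != "Windows"` suffixes in this file are standard PEP 508 environment markers: pip evaluates them against the current platform and skips lines whose marker is false, which is how the Windows-only wheel URL and the non-Windows `llama-cpp-python` line coexist. A small sketch of checking such a marker programmatically, assuming the `packaging` library is available; the marker strings mirror the requirements file and nothing here is part of the web UI itself:

```python
# Sketch: evaluate the PEP 508 environment markers used in requirements.txt.
# pip applies the same logic when deciding which lines to install.
from packaging.markers import Marker

non_windows = Marker('platform_system != "Windows"')
windows_only = Marker('platform_system == "Windows"')

# On Linux/macOS this prints True then False; on Windows the opposite.
print(non_windows.evaluate())
print(windows_only.evaluate())
```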