Remove RWKV from requirements.txt

This commit is contained in:
oobabooga 2023-05-23 20:48:12 -03:00
parent 7aed53559a
commit 1490c0af68
2 changed files with 9 additions and 2 deletions


@@ -18,6 +18,14 @@ There is a bug in transformers==4.29.2 that prevents RWKV from being loaded in 8
 The instructions below are from before RWKV was supported in transformers, and they are kept for legacy purposes. The old implementation is possibly faster, but it lacks the full range of samplers that the transformers library offers.
+#### 0. Install the RWKV library
+```
+pip install rwkv
+```
+`0.7.3` was the last version that I tested. If you experience any issues, try `pip install rwkv==0.7.3`.
 #### 1. Download the model
 It is available in different sizes:


@@ -10,12 +10,11 @@ pandas
 Pillow>=9.5.0
 pyyaml
 requests
-rwkv==0.7.3
 safetensors==0.3.1
 sentencepiece
-transformers==4.29.2
 tqdm
 git+https://github.com/huggingface/peft@4fd374e80d670781c0d82c96ce94d1215ff23306
+transformers==4.29.2
 bitsandbytes==0.38.1; platform_system != "Windows"
 llama-cpp-python==0.1.51; platform_system != "Windows"
 https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.51/llama_cpp_python-0.1.51-cp310-cp310-win_amd64.whl; platform_system == "Windows"
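
The `platform_system` conditions in the requirements file above are PEP 508 environment markers: pip evaluates `platform_system` as the value of `platform.system()` and installs a line only when its marker is true. A minimal sketch of that selection logic using only the standard library (the helper name is hypothetical, not part of pip's API):

```python
import platform

def marker_matches(expected: str, negate: bool = False) -> bool:
    """Mimic how pip evaluates a `platform_system` environment marker.

    Per PEP 508, the marker variable `platform_system` is the value of
    platform.system(), e.g. "Linux", "Darwin", or "Windows".
    """
    result = platform.system() == expected
    return not result if negate else result

# Which llama-cpp-python requirement line applies on this machine?
if marker_matches("Windows"):
    print("install the prebuilt cp310 Windows wheel")
else:
    print("install llama-cpp-python==0.1.51 from PyPI")
```

This is why the Windows wheel URL and the PyPI `llama-cpp-python` line can coexist in one requirements file: on any given platform exactly one of the two markers evaluates to true.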