Author    | Commit     | Message | Date
oobabooga | 501afbc234 | Add requests to requirements.txt | 2023-03-11 14:47:30 -03:00
oobabooga | fd540b8930 | Use new LLaMA implementation (this will break stuff. I am sorry) https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model | 2023-03-09 17:59:15 -03:00
oobabooga | 8660227e1b | Add top_k to RWKV | 2023-03-07 17:24:28 -03:00
oobabooga | 153dfeb4dd | Add --rwkv-cuda-on parameter, bump rwkv version | 2023-03-06 20:12:54 -03:00
oobabooga | 145c725c39 | Bump RWKV version | 2023-03-05 16:28:21 -03:00
oobabooga | 5492e2e9f8 | Add sentencepiece | 2023-03-05 10:02:24 -03:00
oobabooga | c33715ad5b | Move towards HF LLaMA implementation | 2023-03-05 01:20:31 -03:00
oobabooga | bcea196c9d | Bump flexgen version | 2023-03-02 12:03:57 -03:00
oobabooga | 7a9b4407b0 | Settle for 0.0.6 for now | 2023-03-01 17:37:14 -03:00
oobabooga | f351dce032 | Keep rwkv up to date | 2023-03-01 17:36:16 -03:00
oobabooga | 9c86a1cd4a | Add RWKV pip package | 2023-03-01 11:42:49 -03:00
oobabooga | b16f097466 | Add FlexGen to requirements.txt | 2023-02-27 08:58:07 -03:00
oobabooga | 4548227fb5 | Downgrade gradio version (file uploads are broken in 3.19.1) | 2023-02-25 22:59:02 -03:00
oobabooga | 32f40f3b42 | Bump gradio version to 3.19.1 | 2023-02-25 17:20:03 -03:00
oobabooga | fe1771157f | Properly scrape huggingface for download links (for #122) | 2023-02-24 14:06:42 -03:00
oobabooga | 55bb5e5ef0 | Bump accelerate version | 2023-02-18 22:15:47 -03:00
oobabooga | a55e8836f6 | Bump gradio version. It looks uglier, but the old one was bugged and unstable. | 2023-02-15 20:20:56 -03:00
oobabooga | 3277b751f5 | Add softprompt support (for real this time). Is this too much voodoo for our purposes? | 2023-02-13 15:25:16 -03:00
oobabooga | b4fc8dfa8f | Add safetensors version | 2023-02-04 18:58:17 -03:00
oobabooga | 3dbebe30b1 | Remove deepspeed requirement (only works on Linux for now) | 2023-02-03 20:07:13 -03:00
oobabooga | 03f084f311 | Add safetensors support | 2023-02-03 18:36:32 -03:00
oobabooga | 03ebfba0fb | Bump bitsandbytes version | 2023-02-03 09:29:31 -03:00
oobabooga | 224be31a74 | Use main bs4 package | 2023-02-02 12:20:58 -03:00
oobabooga | cecaebc291 | Add bs4 requirement (fixes #47) | 2023-02-02 12:18:32 -03:00
oobabooga | d6b2d68527 | Remove redundant requirements | 2023-02-02 10:40:09 -03:00
81300     | a97afa6965 | Add DeepSpeed ZeRO-3 integration | 2023-02-01 18:48:13 +02:00
oobabooga | 414fa9d161 | Revert transformers version (gpt-j and opt are broken) | 2023-01-26 23:01:13 -03:00
Silver267 | ad191e295b | bump transformers. idk but there seems to be a lot of fixes in the new version, and it's working according to my local tests. | 2023-01-25 23:22:20 -05:00
oobabooga | fc73188ec7 | Allow specifying your own profile picture in chat mode | 2023-01-25 19:37:44 -03:00
oobabooga | 3fa14befc5 | Bump the gradio version, add back the queue | 2023-01-25 16:10:35 -03:00
oobabooga | 4067cecf67 | Bump bitsandbytes version | 2023-01-20 12:51:49 -03:00
oobabooga | fed7233ff4 | Add script to download models | 2023-01-06 19:57:31 -03:00
oobabooga | 874cd6ff3f | Add the requirements.txt file (duh) | 2023-01-06 00:25:33 -03:00