Author | Commit | Message | Date
deepdiffuser | ab47044459 | add multi-gpu support for 4bit gptq LLaMA | 2023-03-10 04:52:45 -08:00
oobabooga | 1d7e893fa1 | Merge pull request #211 from zoidbb/add-tokenizer-to-hf-downloads (download tokenizer when present) | 2023-03-10 00:46:21 -03:00
oobabooga | 875847bf88 | Consider tokenizer a type of text | 2023-03-10 00:45:28 -03:00
oobabooga | 8ed214001d | Merge branch 'main' of github.com:oobabooga/text-generation-webui | 2023-03-10 00:42:09 -03:00
oobabooga | 249c268176 | Fix the download script for long lists of files on HF | 2023-03-10 00:41:10 -03:00
Ber Zoidberg | ec3de0495c | download tokenizer when present | 2023-03-09 19:08:09 -08:00
oobabooga | 7c3d1b43c1 | Merge pull request #204 from MichealC0/patch-1 (Update README.md) | 2023-03-09 23:04:09 -03:00
oobabooga | 9849aac0f1 | Don't show .pt models in the list | 2023-03-09 21:54:50 -03:00
oobabooga | 1a3d25f75d | Merge pull request #206 from oobabooga/llama-4bit (Add LLaMA 4-bit support) | 2023-03-09 21:07:32 -03:00
oobabooga | eb0cb9b6df | Update README | 2023-03-09 20:53:52 -03:00
oobabooga | 74102d5ee4 | Insert to the path instead of appending | 2023-03-09 20:51:22 -03:00
oobabooga | 2965aa1625 | Check if the .pt file exists | 2023-03-09 20:48:51 -03:00
oobabooga | d41e3c233b | Update README.md | 2023-03-09 18:02:44 -03:00
oobabooga | fd540b8930 | Use new LLaMA implementation (this will break stuff. I am sorry) https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model | 2023-03-09 17:59:15 -03:00
oobabooga | 828a524f9a | Add LLaMA 4-bit support | 2023-03-09 15:50:26 -03:00
oobabooga | 33414478bf | Update README | 2023-03-09 11:13:03 -03:00
oobabooga | e7adf5fe4e | Add Contrastive Search preset #197 | 2023-03-09 10:27:11 -03:00
oobabooga | 557c773df7 | Merge pull request #201 from jtang613/Name_It (Lets propose a name besides "Gradio") | 2023-03-09 09:45:47 -03:00
oobabooga | 038e90765b | Rename to "Text generation web UI" | 2023-03-09 09:44:08 -03:00
Chimdumebi Nebolisa | 4dd14dcab4 | Update README.md | 2023-03-09 10:22:09 +01:00
jtang613 | 807a41cf87 | Lets propose a name besides "Gradio" | 2023-03-08 21:02:25 -05:00
oobabooga | c09f416adb | Change the Naive preset (again) | 2023-03-07 23:17:13 -03:00
oobabooga | 8e89bc596b | Fix encode() for RWKV | 2023-03-07 23:15:46 -03:00
oobabooga | 44e6d82185 | Remove unused imports | 2023-03-07 22:56:15 -03:00
oobabooga | 19a34941ed | Add proper streaming to RWKV | 2023-03-07 18:17:56 -03:00
oobabooga | 8660227e1b | Add top_k to RWKV | 2023-03-07 17:24:28 -03:00
oobabooga | 827ae51f72 | Sort the imports | 2023-03-07 00:23:36 -03:00
oobabooga | b4bfd87319 | Update README.md | 2023-03-06 20:55:01 -03:00
oobabooga | 8f4a197c05 | Add credits | 2023-03-06 20:34:36 -03:00
oobabooga | d0e8780555 | Update README.md | 2023-03-06 20:17:59 -03:00
oobabooga | 18eaeebbfe | Merge branch 'main' of github.com:oobabooga/text-generation-webui | 2023-03-06 20:16:03 -03:00
oobabooga | 18ccfcd7fe | Update README.md | 2023-03-06 20:15:55 -03:00
oobabooga | 153dfeb4dd | Add --rwkv-cuda-on parameter, bump rwkv version | 2023-03-06 20:12:54 -03:00
oobabooga | 4143d4aa48 | Merge pull request #175 from SagsMug/main (Add api example using websockets) | 2023-03-06 19:53:48 -03:00
oobabooga | 8b882c132a | tabs -> spaces | 2023-03-06 19:52:26 -03:00
oobabooga | eebec65075 | Improve readability | 2023-03-06 19:46:46 -03:00
oobabooga | 99f69dfcaa | Merge pull request #169 from MetaIX/patch-1 (Support for Eleven Labs TTS) | 2023-03-06 19:40:31 -03:00
oobabooga | 944fdc03b2 | Rename the folder | 2023-03-06 19:38:36 -03:00
oobabooga | 49ae183ac9 | Move new extension to a separate file | 2023-03-06 19:28:53 -03:00
oobabooga | 91823e1ed1 | Update README.md | 2023-03-06 16:48:31 -03:00
oobabooga | 6904a507c6 | Change some parameters | 2023-03-06 16:29:43 -03:00
oobabooga | 20bd645f6a | Fix bug in multigpu setups (attempt 3) | 2023-03-06 15:58:18 -03:00
oobabooga | 09a7c36e1b | Minor improvement while running custom models | 2023-03-06 15:36:35 -03:00
oobabooga | 24c4c20391 | Fix bug in multigpu setups (attempt #2) | 2023-03-06 15:23:29 -03:00
oobabooga | d88b7836c6 | Fix bug in multigpu setups | 2023-03-06 14:58:30 -03:00
oobabooga | 5bed607b77 | Increase repetition frequency/penalty for RWKV | 2023-03-06 14:25:48 -03:00
oobabooga | aa7ce0665e | Merge branch 'main' of github.com:oobabooga/text-generation-webui | 2023-03-06 10:58:41 -03:00
oobabooga | bf56b6c1fb | Load settings.json without the need for --settings settings.json (This is for setting UI defaults) | 2023-03-06 10:57:45 -03:00
oobabooga | 2de9f122cd | Update README.md | 2023-03-06 09:34:49 -03:00
oobabooga | e91f4bc25a | Add RWKV tokenizer | 2023-03-06 08:45:49 -03:00