oobabooga
fed3617f07
Move LLaMA 4-bit into a separate file
2023-03-12 11:12:34 -03:00
draff
28fd4fc970
Change wording to be consistent with other args
2023-03-10 23:34:13 +00:00
draff
001e638b47
Make it actually work
2023-03-10 23:28:19 +00:00
draff
804486214b
Re-implement --load-in-4bit and update --llama-bits arg description
2023-03-10 23:21:01 +00:00
ItsLogic
9ba8156a70
remove unnecessary Path()
2023-03-10 22:33:58 +00:00
draff
e6c631aea4
Replace --load-in-4bit with --llama-bits
Replaces --load-in-4bit with a more flexible --llama-bits arg to allow for 2 and 3 bit models as well. This commit also fixes a loading issue with .pt files which are not in the root of the models folder
2023-03-10 21:36:45 +00:00
oobabooga
026d60bd34
Remove default preset that didn't do anything
2023-03-10 14:01:02 -03:00
oobabooga
e01da4097c
Merge pull request #210 from rohvani/pt-path-changes
Add llama-65b-4bit.pt support
2023-03-10 11:04:56 -03:00
oobabooga
e9dbdafb14
Merge branch 'main' into pt-path-changes
2023-03-10 11:03:42 -03:00
oobabooga
706a03b2cb
Minor changes
2023-03-10 11:02:25 -03:00
oobabooga
de7dd8b6aa
Add comments
2023-03-10 10:54:08 -03:00
oobabooga
113b791aa5
Merge pull request #219 from deepdiffuser/4bit-multigpu
add multi-gpu support for 4bit gptq LLaMA
2023-03-10 10:52:45 -03:00
oobabooga
e461c0b7a0
Move the import to the top
2023-03-10 10:51:12 -03:00
deepdiffuser
9fbd60bf22
add no_split_module_classes to prevent tensor split error
2023-03-10 05:30:47 -08:00
deepdiffuser
ab47044459
add multi-gpu support for 4bit gptq LLaMA
2023-03-10 04:52:45 -08:00
rohvani
2ac2913747
fix reference issue
2023-03-09 20:13:23 -08:00
oobabooga
1d7e893fa1
Merge pull request #211 from zoidbb/add-tokenizer-to-hf-downloads
download tokenizer when present
2023-03-10 00:46:21 -03:00
oobabooga
875847bf88
Consider tokenizer a type of text
2023-03-10 00:45:28 -03:00
oobabooga
8ed214001d
Merge branch 'main' of github.com:oobabooga/text-generation-webui
2023-03-10 00:42:09 -03:00
oobabooga
249c268176
Fix the download script for long lists of files on HF
2023-03-10 00:41:10 -03:00
Ber Zoidberg
ec3de0495c
download tokenizer when present
2023-03-09 19:08:09 -08:00
rohvani
5ee376c580
add LLaMA preset
2023-03-09 18:31:41 -08:00
rohvani
826e297b0e
add llama-65b-4bit support & multiple pt paths
2023-03-09 18:31:32 -08:00
oobabooga
7c3d1b43c1
Merge pull request #204 from MichealC0/patch-1
Update README.md
2023-03-09 23:04:09 -03:00
oobabooga
9849aac0f1
Don't show .pt models in the list
2023-03-09 21:54:50 -03:00
oobabooga
1a3d25f75d
Merge pull request #206 from oobabooga/llama-4bit
Add LLaMA 4-bit support
2023-03-09 21:07:32 -03:00
oobabooga
eb0cb9b6df
Update README
2023-03-09 20:53:52 -03:00
oobabooga
74102d5ee4
Insert to the path instead of appending
2023-03-09 20:51:22 -03:00
oobabooga
2965aa1625
Check if the .pt file exists
2023-03-09 20:48:51 -03:00
oobabooga
d41e3c233b
Update README.md
2023-03-09 18:02:44 -03:00
oobabooga
fd540b8930
Use new LLaMA implementation (this will break stuff. I am sorry)
https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model
2023-03-09 17:59:15 -03:00
oobabooga
828a524f9a
Add LLaMA 4-bit support
2023-03-09 15:50:26 -03:00
oobabooga
33414478bf
Update README
2023-03-09 11:13:03 -03:00
oobabooga
e7adf5fe4e
Add Contrastive Search preset #197
2023-03-09 10:27:11 -03:00
oobabooga
557c773df7
Merge pull request #201 from jtang613/Name_It
Lets propose a name besides "Gradio"
2023-03-09 09:45:47 -03:00
oobabooga
038e90765b
Rename to "Text generation web UI"
2023-03-09 09:44:08 -03:00
Chimdumebi Nebolisa
4dd14dcab4
Update README.md
2023-03-09 10:22:09 +01:00
jtang613
807a41cf87
Lets propose a name besides "Gradio"
2023-03-08 21:02:25 -05:00
oobabooga
c09f416adb
Change the Naive preset
(again)
2023-03-07 23:17:13 -03:00
oobabooga
8e89bc596b
Fix encode() for RWKV
2023-03-07 23:15:46 -03:00
oobabooga
44e6d82185
Remove unused imports
2023-03-07 22:56:15 -03:00
oobabooga
19a34941ed
Add proper streaming to RWKV
2023-03-07 18:17:56 -03:00
oobabooga
8660227e1b
Add top_k to RWKV
2023-03-07 17:24:28 -03:00
oobabooga
827ae51f72
Sort the imports
2023-03-07 00:23:36 -03:00
oobabooga
b4bfd87319
Update README.md
2023-03-06 20:55:01 -03:00
oobabooga
8f4a197c05
Add credits
2023-03-06 20:34:36 -03:00
oobabooga
d0e8780555
Update README.md
2023-03-06 20:17:59 -03:00
oobabooga
18eaeebbfe
Merge branch 'main' of github.com:oobabooga/text-generation-webui
2023-03-06 20:16:03 -03:00
oobabooga
18ccfcd7fe
Update README.md
2023-03-06 20:15:55 -03:00
oobabooga
153dfeb4dd
Add --rwkv-cuda-on parameter, bump rwkv version
2023-03-06 20:12:54 -03:00