Commit Graph

315 Commits

Author  SHA1  Message  Date

oobabooga  88fa38ac01  Update README.md  2023-04-01 14:49:03 -03:00
oobabooga  4b57bd0d99  Update README.md  2023-04-01 14:38:04 -03:00
oobabooga  b53bec5a1f  Update README.md  2023-04-01 14:37:35 -03:00
oobabooga  9160586c04  Update README.md  2023-04-01 14:31:10 -03:00
oobabooga  7ec11ae000  Update README.md  2023-04-01 14:15:19 -03:00
oobabooga  012f4f83b8  Update README.md  2023-04-01 13:55:15 -03:00
oobabooga  2c52310642  Add --threads flag for llama.cpp  2023-03-31 21:18:05 -03:00
oobabooga  cbfe0b944a  Update README.md  2023-03-31 17:49:11 -03:00
oobabooga  5c4e44b452  llama.cpp documentation  2023-03-31 15:20:39 -03:00
oobabooga  d4a9b5ea97  Remove redundant preset (see the plot in #587)  2023-03-30 17:34:44 -03:00
oobabooga  41b58bc47e  Update README.md  2023-03-29 11:02:29 -03:00
oobabooga  3b4447a4fe  Update README.md  2023-03-29 02:24:11 -03:00
oobabooga  5d0b83c341  Update README.md  2023-03-29 02:22:19 -03:00
oobabooga  c2a863f87d  Mention the updated one-click installer  2023-03-29 02:11:51 -03:00
oobabooga  010b259dde  Update documentation  2023-03-28 17:46:00 -03:00
oobabooga  036163a751  Change description  2023-03-27 23:39:26 -03:00
oobabooga  30585b3e71  Update README  2023-03-27 23:35:01 -03:00
oobabooga  49c10c5570  Add support for the latest GPTQ models with group-size (#530)  2023-03-26 00:11:33 -03:00
    **Warning: old 4-bit weights will not work anymore!**
    See here how to get up to date weights: https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#step-2-get-the-pre-converted-weights
oobabooga  70f9565f37  Update README.md  2023-03-25 02:35:30 -03:00
oobabooga  04417b658b  Update README.md  2023-03-24 01:40:43 -03:00
oobabooga  143b5b5edf  Mention one-click-bandaid in the README  2023-03-23 23:28:50 -03:00
oobabooga  6872ffd976  Update README.md  2023-03-20 16:53:14 -03:00
oobabooga  dd4374edde  Update README  2023-03-19 20:15:15 -03:00
oobabooga  9378754cc7  Update README  2023-03-19 20:14:50 -03:00
oobabooga  7ddf6147ac  Update README.md  2023-03-19 19:25:52 -03:00
oobabooga  ddb62470e9  --no-cache and --gpu-memory in MiB for fine VRAM control  2023-03-19 19:21:41 -03:00
oobabooga  0cbe2dd7e9  Update README.md  2023-03-18 12:24:54 -03:00
oobabooga  d2a7fac8ea  Use pip instead of conda for pytorch  2023-03-18 11:56:04 -03:00
oobabooga  a0b1a30fd5  Specify torchvision/torchaudio versions  2023-03-18 11:23:56 -03:00
oobabooga  a163807f86  Update README.md  2023-03-18 03:07:27 -03:00
oobabooga  a7acfa4893  Update README.md  2023-03-17 22:57:46 -03:00
oobabooga  dc35861184  Update README.md  2023-03-17 21:05:17 -03:00
oobabooga  f2a5ca7d49  Update README.md  2023-03-17 20:50:27 -03:00
oobabooga  8c8286b0e6  Update README.md  2023-03-17 20:49:40 -03:00
oobabooga  0c05e65e5c  Update README.md  2023-03-17 20:25:42 -03:00
oobabooga  66e8d12354  Update README.md  2023-03-17 19:59:37 -03:00
oobabooga  9a871117d7  Update README.md  2023-03-17 19:52:22 -03:00
oobabooga  d4f38b6a1f  Update README.md  2023-03-17 18:57:48 -03:00
oobabooga  ad7c829953  Update README.md  2023-03-17 18:55:01 -03:00
oobabooga  4426f941e0  Update the installation instructions. Tldr use WSL  2023-03-17 18:51:07 -03:00
oobabooga  ebef4a510b  Update README  2023-03-17 11:58:45 -03:00
oobabooga  cdfa787bcb  Update README  2023-03-17 11:53:28 -03:00
oobabooga  dd1c5963da  Update README  2023-03-16 12:45:27 -03:00
oobabooga  445ebf0ba8  Update README.md  2023-03-15 20:06:46 -03:00
oobabooga  09045e4bdb  Add WSL guide  2023-03-15 19:42:06 -03:00
oobabooga  128d18e298  Update README.md  2023-03-14 17:57:25 -03:00
oobabooga  1236c7f971  Update README.md  2023-03-14 17:56:15 -03:00
oobabooga  b419dffba3  Update README.md  2023-03-14 17:55:35 -03:00
oobabooga  87192e2813  Update README  2023-03-14 08:02:21 -03:00
oobabooga  3da73e409f  Merge branch 'main' into Zerogoki00-opt4-bit  2023-03-14 07:50:36 -03:00
Ayanami Rei  b746250b2f  Update README  2023-03-13 20:20:45 +03:00
oobabooga  66b6971b61  Update README  2023-03-13 12:44:18 -03:00
oobabooga  ddea518e0f  Document --auto-launch  2023-03-13 12:43:33 -03:00
oobabooga  d97bfb8713  Update README.md  2023-03-13 12:39:33 -03:00
oobabooga  bdff37f0bb  Update README.md  2023-03-13 11:05:51 -03:00
oobabooga  d168b6e1f7  Update README.md  2023-03-12 17:54:07 -03:00
oobabooga  54e8f0c31f  Update README.md  2023-03-12 16:58:00 -03:00
oobabooga  3375eaece0  Update README  2023-03-12 15:01:32 -03:00
oobabooga  17210ff88f  Update README.md  2023-03-12 14:31:24 -03:00
oobabooga  4dc1d8c091  Update README.md  2023-03-12 12:46:53 -03:00
oobabooga  89e9493509  Update README  2023-03-12 11:23:20 -03:00
draff  28fd4fc970  Change wording to be consistent with other args  2023-03-10 23:34:13 +00:00
draff  804486214b  Re-implement --load-in-4bit and update --llama-bits arg description  2023-03-10 23:21:01 +00:00
draff  e6c631aea4  Replace --load-in-4bit with --llama-bits  2023-03-10 21:36:45 +00:00
    Replaces --load-in-4bit with a more flexible --llama-bits arg to allow for 2 and 3 bit models as well. This commit also fixes a loading issue with .pt files which are not in the root of the models folder
oobabooga  7c3d1b43c1  Merge pull request #204 from MichealC0/patch-1  2023-03-09 23:04:09 -03:00
    Update README.md
oobabooga  1a3d25f75d  Merge pull request #206 from oobabooga/llama-4bit  2023-03-09 21:07:32 -03:00
    Add LLaMA 4-bit support
oobabooga  eb0cb9b6df  Update README  2023-03-09 20:53:52 -03:00
oobabooga  d41e3c233b  Update README.md  2023-03-09 18:02:44 -03:00
oobabooga  33414478bf  Update README  2023-03-09 11:13:03 -03:00
oobabooga  e7adf5fe4e  Add Contrastive Search preset #197  2023-03-09 10:27:11 -03:00
Chimdumebi Nebolisa  4dd14dcab4  Update README.md  2023-03-09 10:22:09 +01:00
oobabooga  b4bfd87319  Update README.md  2023-03-06 20:55:01 -03:00
oobabooga  d0e8780555  Update README.md  2023-03-06 20:17:59 -03:00
oobabooga  18ccfcd7fe  Update README.md  2023-03-06 20:15:55 -03:00
oobabooga  91823e1ed1  Update README.md  2023-03-06 16:48:31 -03:00
oobabooga  aa7ce0665e  Merge branch 'main' of github.com:oobabooga/text-generation-webui  2023-03-06 10:58:41 -03:00
oobabooga  bf56b6c1fb  Load settings.json without the need for --settings settings.json  2023-03-06 10:57:45 -03:00
    This is for setting UI defaults
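The bf56b6c1fb entry above means a settings.json file in the working directory is picked up automatically to set UI defaults, without passing --settings settings.json. As a rough sketch only, with illustrative key names that are not taken from the repository's actual schema:

```json
{
  "max_new_tokens": 200,
  "name1": "You",
  "name2": "Assistant",
  "stop_at_newline": true
}
```

Under that assumption, any keys present in the file override the interface's built-in defaults at startup.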
oobabooga  2de9f122cd  Update README.md  2023-03-06 09:34:49 -03:00
oobabooga  736f61610b  Update README  2023-03-04 01:33:52 -03:00
oobabooga  76378c6cc2  Update README  2023-03-02 11:27:15 -03:00
oobabooga  f4b130e2bd  Update README.md  2023-02-27 15:15:45 -03:00
oobabooga  c183d2917c  Update README.md  2023-02-26 00:59:07 -03:00
oobabooga  cfe010b244  Update README.md  2023-02-26 00:54:37 -03:00
oobabooga  87d9f3e329  Update README.md  2023-02-26 00:54:19 -03:00
oobabooga  955997a90b  Update README.md  2023-02-26 00:54:07 -03:00
oobabooga  c593dfa827  Update README.md  2023-02-25 18:57:34 -03:00
oobabooga  7872a64f78  Update README.md  2023-02-25 18:56:43 -03:00
oobabooga  88cfc84ddb  Update README  2023-02-25 01:33:26 -03:00
oobabooga  0b90e0b3b6  Update README.md  2023-02-24 12:01:07 -03:00
oobabooga  1a23e6d185  Add Pythia to README  2023-02-24 11:38:01 -03:00
oobabooga  f4f508c8e2  Update README.md  2023-02-24 09:03:09 -03:00
oobabooga  ced5d9ab04  Update README.md  2023-02-23 10:04:07 -03:00
oobabooga  b18071330f  Update README.md  2023-02-23 01:32:05 -03:00
oobabooga  b4a7f5fa70  Update README.md  2023-02-22 01:54:12 -03:00
oobabooga  e195377050  Deprecate torch dumps, move to safetensors (they load even faster)  2023-02-20 15:03:19 -03:00
oobabooga  58520a1f75  Update README.md  2023-02-20 12:44:31 -03:00
oobabooga  05e9da0c12  Update README.md  2023-02-18 22:34:51 -03:00
oobabooga  b1add0e586  Update README.md  2023-02-18 22:32:16 -03:00
oobabooga  348acdf626  Mention deepspeed in the README  2023-02-16 17:29:48 -03:00
oobabooga  05b53e4626  Update README  2023-02-15 14:43:34 -03:00