oobabooga | 56b5a4af74 | exllamav2 typical_p | 2023-09-28 20:10:12 -07:00
oobabooga | f8e9733412 | Minor syntax change | 2023-09-28 19:32:35 -07:00
oobabooga | f931184b53 | Increase truncation limits to 32768 | 2023-09-28 19:28:22 -07:00
oobabooga | 1dd13e4643 | Read Transformers config.json metadata | 2023-09-28 19:19:47 -07:00
oobabooga | 9ccaf5eebb | I forgot to add the file | 2023-09-28 18:25:58 -07:00
oobabooga | 92a39c619b | Add Mistral support | 2023-09-28 15:41:03 -07:00
oobabooga | f46ba12b42 | Add flash-attn wheels for Linux | 2023-09-28 14:45:52 -07:00
oobabooga | 771e936769 | Fix extensions install (2nd attempt) | 2023-09-28 14:33:49 -07:00
快乐的我531 | 4e56ad55e1 | Let model downloader download *.tiktoken as well (#4121) | 2023-09-28 18:03:18 -03:00
oobabooga | 822ba7fcbb | Better error handling during install/update | 2023-09-28 13:57:59 -07:00
oobabooga | 85f45cafa1 | Fix extensions install | 2023-09-28 13:54:36 -07:00
Nathan Thomas | e145d9a0da | Update one_click.py to initialize site_packages_path variable (#4118) | 2023-09-28 08:31:29 -03:00
Chenxiao Wang | 3fb1e0236a | fix: update superboogav2 requirements.txt (#4100) | 2023-09-27 23:45:59 -03:00
jllllll | 2bd23c29cb | Bump llama-cpp-python to 0.2.7 (#4110) | 2023-09-27 23:45:36 -03:00
missionfloyd | 86e7c05429 | Delete extensions/Training_PRO/readme.md (#4112) | 2023-09-27 23:45:13 -03:00
Sam | a0d99dcf90 | fix: add missing superboogav2 dep (#4099) | 2023-09-26 23:37:22 -03:00
StoyanStAtanasov | 7e6ff8d1f0 | Enable NUMA feature for llama_cpp_python (#4040) | 2023-09-26 22:05:00 -03:00
oobabooga | 87ea2d96fd | Add a note about RWKV loader | 2023-09-26 17:43:39 -07:00
jllllll | 13a54729b1 | Bump exllamav2 to 0.0.4 and use pre-built wheels (#4095) | 2023-09-26 21:36:14 -03:00
jllllll | 3879ab5007 | Expand MacOS llama.cpp support in requirements (#4094). Provides MacOS 12 and 13 wheels. | 2023-09-26 21:34:48 -03:00
jllllll | 9d9aa38234 | Fix old install migration for WSL installer (#4093) | 2023-09-26 21:34:16 -03:00
HideLord | 0845724a89 | Supercharging superbooga (#3272) | 2023-09-26 21:30:19 -03:00
jllllll | ad00b8eb26 | Check '--model-dir' for no models warning (#4067) | 2023-09-26 10:56:57 -03:00
oobabooga | 0c89180966 | Another minor fix | 2023-09-26 06:54:21 -07:00
oobabooga | 365335e1ae | Minor fix | 2023-09-26 06:47:19 -07:00
oobabooga | 1ca54faaf0 | Improve --multi-user mode | 2023-09-26 06:42:33 -07:00
oobabooga | 019371c0b6 | Lint | 2023-09-25 20:31:11 -07:00
oobabooga | 814520fed1 | Extension install improvements | 2023-09-25 20:27:06 -07:00
oobabooga | 7f1460af29 | Change a warning | 2023-09-25 20:22:27 -07:00
oobabooga | 862b45b1c7 | Extension install improvements | 2023-09-25 19:48:30 -07:00
oobabooga | 44438c60e5 | Add INSTALL_EXTENSIONS environment variable | 2023-09-25 13:12:35 -07:00
oobabooga | 31f2815a04 | Update Generation-Parameters.md | 2023-09-25 16:30:52 -03:00
oobabooga | c8952cce55 | Move documentation from UI to docs/ | 2023-09-25 12:28:28 -07:00
oobabooga | d0d221df49 | Add --use_fast option (closes #3741) | 2023-09-25 12:19:43 -07:00
oobabooga | b973b91d73 | Automatically filter by loader (closes #4072) | 2023-09-25 10:28:35 -07:00
oobabooga | 63de9eb24f | Clean up the transformers loader | 2023-09-24 20:26:26 -07:00
oobabooga | 36c38d7561 | Add disable_exllama to Transformers loader (for GPTQ LoRA training) | 2023-09-24 20:03:11 -07:00
jllllll | c0fca23cb9 | Avoid importing torch in one-click-installer (#4064) | 2023-09-24 22:16:59 -03:00
oobabooga | 55a685d999 | Minor fixes | 2023-09-24 14:15:10 -07:00
oobabooga | 08cf150c0c | Add a grammar editor to the UI (#4061) | 2023-09-24 18:05:24 -03:00
oobabooga | 08c4fb12ae | Use bitsandbytes==0.38.1 for AMD | 2023-09-24 08:11:59 -07:00
oobabooga | d5952cb540 | Don't assume that py-cpuinfo is installed | 2023-09-24 08:10:45 -07:00
oobabooga | eb0b7c1053 | Fix a minor UI bug | 2023-09-24 07:17:33 -07:00
oobabooga | 3edac43426 | Remove print statement | 2023-09-24 07:13:00 -07:00
oobabooga | b227e65d86 | Add grammar to llama.cpp loader (closes #4019) | 2023-09-24 07:10:45 -07:00
oobabooga | a3ad9fe6c0 | Add comments | 2023-09-24 06:08:39 -07:00
oobabooga | 2e7b6b0014 | Create alternative requirements.txt with AMD and Metal wheels (#4052) | 2023-09-24 09:58:29 -03:00
Chenxiao Wang | 9de2dfa887 | extensions/openai: Fix error when preparing cache for embedding models (#3995) | 2023-09-24 00:58:28 -03:00
oobabooga | 7a3ca2c68f | Better detect EXL2 models | 2023-09-23 13:05:55 -07:00
oobabooga | 895ec9dadb | Update README.md | 2023-09-23 15:37:39 -03:00