| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| oobabooga | 7ffb424c7b | Add AutoAWQ to README | 2023-10-05 09:22:37 -07:00 |
| cal066 | cc632c3f33 | AutoAWQ: initial support (#3999) | 2023-10-05 13:19:18 -03:00 |
| oobabooga | 3f56151f03 | Bump to transformers 4.34 | 2023-10-05 08:55:14 -07:00 |
| tdrussell | cb26163a20 | Fix off-by-one error in exllama_hf caching logic (#4145) | 2023-10-05 12:20:56 -03:00 |
| Gennadij | b04c08378d | Add CMD_FLAGS.txt to .gitignore (#4181) | 2023-10-05 10:02:38 -03:00 |
| oobabooga | ae4ba3007f | Add grammar to transformers and _HF loaders (#4091) | 2023-10-05 10:01:36 -03:00 |
| oobabooga | 0197fdddf1 | Merge pull request #4142 from jllllll/llamacpp-0.2.11 (Bump llama-cpp-python to 0.2.11) | 2023-10-02 01:31:14 -03:00 |
| oobabooga | b6fe6acf88 | Add threads_batch parameter | 2023-10-01 21:28:00 -07:00 |
| jllllll | 41a2de96e5 | Bump llama-cpp-python to 0.2.11 | 2023-10-01 18:08:10 -05:00 |
| oobabooga | f2d82f731a | Add recommended NTKv1 alpha values | 2023-09-29 13:48:38 -07:00 |
| oobabooga | abe99cddeb | Extend evaluation slider bounds | 2023-09-29 13:06:26 -07:00 |
| oobabooga | 96da2e1c0d | Read more metadata (config.json & quantize_config.json) | 2023-09-29 06:14:16 -07:00 |
| oobabooga | 56b5a4af74 | exllamav2 typical_p | 2023-09-28 20:10:12 -07:00 |
| oobabooga | f8e9733412 | Minor syntax change | 2023-09-28 19:32:35 -07:00 |
| oobabooga | f931184b53 | Increase truncation limits to 32768 | 2023-09-28 19:28:22 -07:00 |
| oobabooga | 1dd13e4643 | Read Transformers config.json metadata | 2023-09-28 19:19:47 -07:00 |
| oobabooga | 9ccaf5eebb | I forgot to add the file | 2023-09-28 18:25:58 -07:00 |
| oobabooga | 92a39c619b | Add Mistral support | 2023-09-28 15:41:03 -07:00 |
| oobabooga | f46ba12b42 | Add flash-attn wheels for Linux | 2023-09-28 14:45:52 -07:00 |
| oobabooga | 771e936769 | Fix extensions install (2nd attempt) | 2023-09-28 14:33:49 -07:00 |
| 快乐的我531 | 4e56ad55e1 | Let model downloader download *.tiktoken as well (#4121) | 2023-09-28 18:03:18 -03:00 |
| oobabooga | 822ba7fcbb | Better error handling during install/update | 2023-09-28 13:57:59 -07:00 |
| oobabooga | 85f45cafa1 | Fix extensions install | 2023-09-28 13:54:36 -07:00 |
| Nathan Thomas | e145d9a0da | Update one_click.py to initialize site_packages_path variable (#4118) | 2023-09-28 08:31:29 -03:00 |
| Chenxiao Wang | 3fb1e0236a | fix: update superboogav2 requirements.txt (#4100) | 2023-09-27 23:45:59 -03:00 |
| jllllll | 2bd23c29cb | Bump llama-cpp-python to 0.2.7 (#4110) | 2023-09-27 23:45:36 -03:00 |
| missionfloyd | 86e7c05429 | Delete extensions/Training_PRO/readme.md (#4112) | 2023-09-27 23:45:13 -03:00 |
| Sam | a0d99dcf90 | fix: add missing superboogav2 dep (#4099) | 2023-09-26 23:37:22 -03:00 |
| StoyanStAtanasov | 7e6ff8d1f0 | Enable NUMA feature for llama_cpp_python (#4040) | 2023-09-26 22:05:00 -03:00 |
| oobabooga | 87ea2d96fd | Add a note about RWKV loader | 2023-09-26 17:43:39 -07:00 |
| jllllll | 13a54729b1 | Bump exllamav2 to 0.0.4 and use pre-built wheels (#4095) | 2023-09-26 21:36:14 -03:00 |
| jllllll | 3879ab5007 | Expand MacOS llama.cpp support in requirements (#4094) (Provides MacOS 12 and 13 wheels.) | 2023-09-26 21:34:48 -03:00 |
| jllllll | 9d9aa38234 | Fix old install migration for WSL installer (#4093) | 2023-09-26 21:34:16 -03:00 |
| HideLord | 0845724a89 | Supercharging superbooga (#3272) | 2023-09-26 21:30:19 -03:00 |
| jllllll | ad00b8eb26 | Check '--model-dir' for no models warning (#4067) | 2023-09-26 10:56:57 -03:00 |
| oobabooga | 0c89180966 | Another minor fix | 2023-09-26 06:54:21 -07:00 |
| oobabooga | 365335e1ae | Minor fix | 2023-09-26 06:47:19 -07:00 |
| oobabooga | 1ca54faaf0 | Improve --multi-user mode | 2023-09-26 06:42:33 -07:00 |
| oobabooga | 019371c0b6 | Lint | 2023-09-25 20:31:11 -07:00 |
| oobabooga | 814520fed1 | Extension install improvements | 2023-09-25 20:27:06 -07:00 |
| oobabooga | 7f1460af29 | Change a warning | 2023-09-25 20:22:27 -07:00 |
| oobabooga | 862b45b1c7 | Extension install improvements | 2023-09-25 19:48:30 -07:00 |
| oobabooga | 44438c60e5 | Add INSTALL_EXTENSIONS environment variable | 2023-09-25 13:12:35 -07:00 |
| oobabooga | 31f2815a04 | Update Generation-Parameters.md | 2023-09-25 16:30:52 -03:00 |
| oobabooga | c8952cce55 | Move documentation from UI to docs/ | 2023-09-25 12:28:28 -07:00 |
| oobabooga | d0d221df49 | Add --use_fast option (closes #3741) | 2023-09-25 12:19:43 -07:00 |
| oobabooga | b973b91d73 | Automatically filter by loader (closes #4072) | 2023-09-25 10:28:35 -07:00 |
| oobabooga | 63de9eb24f | Clean up the transformers loader | 2023-09-24 20:26:26 -07:00 |
| oobabooga | 36c38d7561 | Add disable_exllama to Transformers loader (for GPTQ LoRA training) | 2023-09-24 20:03:11 -07:00 |
| jllllll | c0fca23cb9 | Avoid importing torch in one-click-installer (#4064) | 2023-09-24 22:16:59 -03:00 |