Commit Graph

1033 Commits

Author SHA1 Message Date
oobabooga
abe99cddeb Extend evaluation slider bounds 2023-09-29 13:06:26 -07:00
oobabooga
96da2e1c0d Read more metadata (config.json & quantize_config.json) 2023-09-29 06:14:16 -07:00
oobabooga
56b5a4af74 exllamav2 typical_p 2023-09-28 20:10:12 -07:00
oobabooga
f8e9733412 Minor syntax change 2023-09-28 19:32:35 -07:00
oobabooga
f931184b53 Increase truncation limits to 32768 2023-09-28 19:28:22 -07:00
oobabooga
1dd13e4643 Read Transformers config.json metadata 2023-09-28 19:19:47 -07:00
StoyanStAtanasov
7e6ff8d1f0 Enable NUMA feature for llama_cpp_python (#4040) 2023-09-26 22:05:00 -03:00
oobabooga
87ea2d96fd Add a note about RWKV loader 2023-09-26 17:43:39 -07:00
oobabooga
0c89180966 Another minor fix 2023-09-26 06:54:21 -07:00
oobabooga
365335e1ae Minor fix 2023-09-26 06:47:19 -07:00
oobabooga
1ca54faaf0 Improve --multi-user mode 2023-09-26 06:42:33 -07:00
oobabooga
019371c0b6 Lint 2023-09-25 20:31:11 -07:00
oobabooga
814520fed1 Extension install improvements 2023-09-25 20:27:06 -07:00
oobabooga
7f1460af29 Change a warning 2023-09-25 20:22:27 -07:00
oobabooga
862b45b1c7 Extension install improvements 2023-09-25 19:48:30 -07:00
oobabooga
c8952cce55 Move documentation from UI to docs/ 2023-09-25 12:28:28 -07:00
oobabooga
d0d221df49 Add --use_fast option (closes #3741) 2023-09-25 12:19:43 -07:00
oobabooga
b973b91d73 Automatically filter by loader (closes #4072) 2023-09-25 10:28:35 -07:00
oobabooga
63de9eb24f Clean up the transformers loader 2023-09-24 20:26:26 -07:00
oobabooga
36c38d7561 Add disable_exllama to Transformers loader (for GPTQ LoRA training) 2023-09-24 20:03:11 -07:00
oobabooga
55a685d999 Minor fixes 2023-09-24 14:15:10 -07:00
oobabooga
08cf150c0c Add a grammar editor to the UI (#4061) 2023-09-24 18:05:24 -03:00
oobabooga
eb0b7c1053 Fix a minor UI bug 2023-09-24 07:17:33 -07:00
oobabooga
3edac43426 Remove print statement 2023-09-24 07:13:00 -07:00
oobabooga
b227e65d86 Add grammar to llama.cpp loader (closes #4019) 2023-09-24 07:10:45 -07:00
oobabooga
2e7b6b0014 Create alternative requirements.txt with AMD and Metal wheels (#4052) 2023-09-24 09:58:29 -03:00
oobabooga
7a3ca2c68f Better detect EXL2 models 2023-09-23 13:05:55 -07:00
oobabooga
b1467bd064 Move one-click-installers into the repository (#4028 from oobabooga/one-click) 2023-09-22 17:43:07 -03:00
oobabooga
c075969875 Add instructions 2023-09-22 13:10:03 -07:00
oobabooga
8ab3eca9ec Add a warning for outdated installations 2023-09-22 09:35:19 -07:00
oobabooga
95976a9d4f Fix a bug while deleting characters 2023-09-22 06:02:34 -07:00
oobabooga
d5330406fa Add a rename menu for chat histories 2023-09-21 19:16:51 -07:00
oobabooga
00ab450c13 Multiple histories for each character (#4022) 2023-09-21 17:19:32 -03:00
oobabooga
029da9563f Avoid redundant function call in llamacpp_hf 2023-09-19 14:14:40 -07:00
oobabooga
869f47fff9 Lint 2023-09-19 13:51:57 -07:00
oobabooga
13ac55fa18 Reorder some functions 2023-09-19 13:51:57 -07:00
oobabooga
03dc69edc5 ExLlama_HF (v1 and v2) prefix matching 2023-09-19 13:12:19 -07:00
oobabooga
5075087461 Fix command-line arguments being ignored 2023-09-19 13:11:46 -07:00
oobabooga
ff5d3d2d09 Add missing import 2023-09-18 16:26:54 -07:00
oobabooga
605ec3c9f2 Add a warning about ExLlamaV2 without flash-attn 2023-09-18 12:26:35 -07:00
oobabooga
f0ef971edb Remove obsolete warning 2023-09-18 12:25:10 -07:00
oobabooga
745807dc03 Faster llamacpp_HF prefix matching 2023-09-18 11:02:45 -07:00
BadisG
893a72a1c5 Stop generation immediately when using "Maximum tokens/second" (#3952) 2023-09-18 14:27:06 -03:00
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
Cebtenzzre
8466cf229a llama.cpp: fix ban_eos_token (#3987) 2023-09-18 12:15:02 -03:00
oobabooga
0ede2965d5 Remove an error message 2023-09-17 18:46:08 -07:00
missionfloyd
cc8eda298a Move hover menu shortcuts to right side (#3951) 2023-09-17 22:33:00 -03:00
oobabooga
280cca9f66 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-09-17 18:01:27 -07:00
oobabooga
b062d50c45 Remove exllama import that causes problems 2023-09-17 18:00:32 -07:00
James Braza
fee38e0601 Simplified ExLlama cloning instructions and failure message (#3972) 2023-09-17 19:26:05 -03:00
Lu Guanghua
9858acee7b Fix unexpected extensions load after gradio restart (#3965) 2023-09-17 17:35:43 -03:00
oobabooga
d9b0f2c9c3 Fix llama.cpp double decoding 2023-09-17 13:07:48 -07:00
oobabooga
d71465708c llamacpp_HF prefix matching 2023-09-17 11:51:01 -07:00
oobabooga
37e2980e05 Recommend mul_mat_q for llama.cpp 2023-09-17 08:27:11 -07:00
oobabooga
a069f3904c Undo part of ad8ac545a5 2023-09-17 08:12:23 -07:00
oobabooga
ad8ac545a5 Tokenization improvements 2023-09-17 07:02:00 -07:00
saltacc
cd08eb0753 token probs for non HF loaders (#3957) 2023-09-17 10:42:32 -03:00
kalomaze
7c9664ed35 Allow full model URL to be used for download (#3919) 2023-09-16 10:06:13 -03:00
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
saltacc
ed6b6411fb Fix exllama tokenizers (#3954) 2023-09-16 09:42:38 -03:00
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
missionfloyd
2ad6ca8874 Add back chat buttons with --chat-buttons (#3947) 2023-09-16 00:39:37 -03:00
oobabooga
ef04138bc0 Improve the UI tokenizer 2023-09-15 19:30:44 -07:00
oobabooga
c3e4c9fdc2 Add a simple tokenizer to the UI 2023-09-15 19:09:03 -07:00
saltacc
f01b9aa71f Add customizable ban tokens (#3899) 2023-09-15 18:27:27 -03:00
oobabooga
5b117590ad Add some scrollbars to Parameters tab 2023-09-15 09:17:37 -07:00
Johan
fdcee0c215 Allow custom tokenizer for llamacpp_HF loader (#3941) 2023-09-15 12:38:38 -03:00
oobabooga
fd7257c7f8 Prevent code blocks from flickering while streaming 2023-09-15 07:46:26 -07:00
oobabooga
a3ecf3bb65 Add cai-chat-square chat style 2023-09-14 16:15:08 -07:00
oobabooga
3d1c0f173d User config precedence over GGUF metadata 2023-09-14 12:15:52 -07:00
oobabooga
94dc64f870 Add a border 2023-09-14 07:20:36 -07:00
oobabooga
70aafa34dc Fix blockquote markdown rendering 2023-09-14 05:57:04 -07:00
oobabooga
644a9b8765 Change the chat generate button 2023-09-14 05:16:44 -07:00
oobabooga
ecc90f9f62 Continue on Alt + Enter 2023-09-14 03:59:12 -07:00
oobabooga
1ce3c93600 Allow "Your name" field to be saved 2023-09-14 03:44:35 -07:00
oobabooga
27dbcc59f5 Make the chat input expand upwards (#3920) 2023-09-14 07:06:42 -03:00
oobabooga
6b6af74e14 Keyboard shortcuts without conflicts (hopefully) 2023-09-14 02:33:52 -07:00
oobabooga
fc11d1eff0 Add chat keyboard shortcuts 2023-09-13 19:22:40 -07:00
oobabooga
9f199c7a4c Use Noto Sans font (copied from 6c8bd06308/public/webfonts/NotoSans) 2023-09-13 13:48:05 -07:00
oobabooga
8ce94b735c Show progress on impersonate 2023-09-13 11:22:53 -07:00
oobabooga
7cd437e05c Properly close the hover menu on mobile 2023-09-13 11:10:46 -07:00
oobabooga
1b47b5c676 Change the Generate/Stop buttons 2023-09-13 09:25:26 -07:00
oobabooga
8ea28cbfe0 Reorder chat buttons 2023-09-13 08:49:11 -07:00
oobabooga
5e3d2f7d44 Reorganize chat buttons (#3892) 2023-09-13 02:36:12 -03:00
Panchovix
34dc7306b8 Fix NTK (alpha) and RoPE scaling for exllamav2 and exllamav2_HF (#3897) 2023-09-13 02:35:09 -03:00
oobabooga
b7adf290fc Fix ExLlama-v2 path issue 2023-09-12 17:42:22 -07:00
oobabooga
b190676893 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-09-12 15:06:33 -07:00
oobabooga
2f935547c8 Minor changes 2023-09-12 15:05:21 -07:00
oobabooga
18e6b275f3 Add alpha_value/compress_pos_emb to ExLlama-v2 2023-09-12 15:02:47 -07:00
Gennadij
460c40d8ab Read more GGUF metadata (scale_linear and freq_base) (#3877) 2023-09-12 17:02:42 -03:00
oobabooga
16e1696071 Minor qol change 2023-09-12 10:44:26 -07:00
oobabooga
c2a309f56e Add ExLlamaV2 and ExLlamav2_HF loaders (#3881) 2023-09-12 14:33:07 -03:00
oobabooga
df123a20fc Prevent extra keys from being saved to settings.yaml 2023-09-11 20:13:10 -07:00
oobabooga
dae428a967 Revamp cai-chat theme, make it default 2023-09-11 19:30:40 -07:00
oobabooga
78811dd89a Fix GGUF metadata reading for falcon 2023-09-11 15:49:50 -07:00
oobabooga
9331ab4798 Read GGUF metadata (#3873) 2023-09-11 18:49:30 -03:00
oobabooga
df52dab67b Lint 2023-09-11 07:57:38 -07:00
oobabooga
ed86878f02 Remove GGML support 2023-09-11 07:44:00 -07:00
John Smith
cc7b7ba153 fix lora training with alpaca_lora_4bit (#3853) 2023-09-11 01:22:20 -03:00
Forkoz
15e9b8c915 Exllama new rope settings (#3852) 2023-09-11 01:14:36 -03:00
oobabooga
4affa08821 Do not impose instruct mode while loading models 2023-09-02 11:31:33 -07:00
oobabooga
47e490c7b4 Set use_cache=True by default for all models 2023-08-30 13:26:27 -07:00
missionfloyd
787219267c Allow downloading single file from UI (#3737) 2023-08-29 23:32:36 -03:00