oobabooga | 644a9b8765 | Change the chat generate button | 2023-09-14 05:16:44 -07:00
oobabooga | ecc90f9f62 | Continue on Alt + Enter | 2023-09-14 03:59:12 -07:00
oobabooga | 1ce3c93600 | Allow "Your name" field to be saved | 2023-09-14 03:44:35 -07:00
oobabooga | 27dbcc59f5 | Make the chat input expand upwards (#3920) | 2023-09-14 07:06:42 -03:00
oobabooga | 6b6af74e14 | Keyboard shortcuts without conflicts (hopefully) | 2023-09-14 02:33:52 -07:00
oobabooga | fc11d1eff0 | Add chat keyboard shortcuts | 2023-09-13 19:22:40 -07:00
oobabooga | 9f199c7a4c | Use Noto Sans font (copied from 6c8bd06308/public/webfonts/NotoSans) | 2023-09-13 13:48:05 -07:00
oobabooga | 8ce94b735c | Show progress on impersonate | 2023-09-13 11:22:53 -07:00
oobabooga | 7cd437e05c | Properly close the hover menu on mobile | 2023-09-13 11:10:46 -07:00
oobabooga | 1b47b5c676 | Change the Generate/Stop buttons | 2023-09-13 09:25:26 -07:00
oobabooga | 8ea28cbfe0 | Reorder chat buttons | 2023-09-13 08:49:11 -07:00
oobabooga | 5e3d2f7d44 | Reorganize chat buttons (#3892) | 2023-09-13 02:36:12 -03:00
Panchovix | 34dc7306b8 | Fix NTK (alpha) and RoPE scaling for exllamav2 and exllamav2_HF (#3897) | 2023-09-13 02:35:09 -03:00
oobabooga | b7adf290fc | Fix ExLlama-v2 path issue | 2023-09-12 17:42:22 -07:00
oobabooga | b190676893 | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-09-12 15:06:33 -07:00
oobabooga | 2f935547c8 | Minor changes | 2023-09-12 15:05:21 -07:00
oobabooga | 18e6b275f3 | Add alpha_value/compress_pos_emb to ExLlama-v2 | 2023-09-12 15:02:47 -07:00
Gennadij | 460c40d8ab | Read more GGUF metadata (scale_linear and freq_base) (#3877) | 2023-09-12 17:02:42 -03:00
oobabooga | 16e1696071 | Minor qol change | 2023-09-12 10:44:26 -07:00
oobabooga | c2a309f56e | Add ExLlamaV2 and ExLlamav2_HF loaders (#3881) | 2023-09-12 14:33:07 -03:00
oobabooga | df123a20fc | Prevent extra keys from being saved to settings.yaml | 2023-09-11 20:13:10 -07:00
oobabooga | dae428a967 | Revamp cai-chat theme, make it default | 2023-09-11 19:30:40 -07:00
oobabooga | 78811dd89a | Fix GGUF metadata reading for falcon | 2023-09-11 15:49:50 -07:00
oobabooga | 9331ab4798 | Read GGUF metadata (#3873) | 2023-09-11 18:49:30 -03:00
oobabooga | df52dab67b | Lint | 2023-09-11 07:57:38 -07:00
oobabooga | ed86878f02 | Remove GGML support | 2023-09-11 07:44:00 -07:00
John Smith | cc7b7ba153 | fix lora training with alpaca_lora_4bit (#3853) | 2023-09-11 01:22:20 -03:00
Forkoz | 15e9b8c915 | Exllama new rope settings (#3852) | 2023-09-11 01:14:36 -03:00
oobabooga | 4affa08821 | Do not impose instruct mode while loading models | 2023-09-02 11:31:33 -07:00
oobabooga | 47e490c7b4 | Set use_cache=True by default for all models | 2023-08-30 13:26:27 -07:00
missionfloyd | 787219267c | Allow downloading single file from UI (#3737) | 2023-08-29 23:32:36 -03:00
oobabooga | cec8db52e5 | Add max_tokens_second param (#3533) | 2023-08-29 17:44:31 -03:00
oobabooga | 2b58a89f6a | Clear instruction template before loading new one | 2023-08-29 13:11:32 -07:00
oobabooga | 36864cb3e8 | Use Alpaca as the default instruction template | 2023-08-29 13:06:25 -07:00
oobabooga | 9a202f7fb2 | Prevent <ul> lists from flickering during streaming | 2023-08-28 20:45:07 -07:00
oobabooga | 439dd0faab | Fix stopping strings in the chat API | 2023-08-28 19:40:11 -07:00
oobabooga | c75f98a6d6 | Autoscroll Notebook/Default textareas during streaming | 2023-08-28 18:22:03 -07:00
oobabooga | 558e918fd6 | Add a typing dots (...) animation to chat tab | 2023-08-28 13:50:36 -07:00
oobabooga | 57e9ded00c | Make it possible to scroll during streaming (#3721) | 2023-08-28 16:03:20 -03:00
Cebtenzzre | 2f5d769a8d | accept floating-point alpha value on the command line (#3712) | 2023-08-27 18:54:43 -03:00
oobabooga | b2296dcda0 | Ctrl+S to show/hide chat controls | 2023-08-27 13:14:33 -07:00
Ravindra Marella | e4c3e1bdd2 | Fix ctransformers model unload (#3711): add missing comma in model types list; fixes marella/ctransformers#111 | 2023-08-27 10:53:48 -03:00
oobabooga | 0c9e818bb8 | Update truncation length based on max_seq_len/n_ctx | 2023-08-26 23:10:45 -07:00
oobabooga | 3361728da1 | Change some comments | 2023-08-26 22:24:44 -07:00
oobabooga | 8aeae3b3f4 | Fix llamacpp_HF loading | 2023-08-26 22:15:06 -07:00
oobabooga | 7f5370a272 | Minor fixes/cosmetics | 2023-08-26 22:11:07 -07:00
jllllll | 4d61a7d9da | Account for deprecated GGML parameters | 2023-08-26 14:07:46 -05:00
jllllll | 4a999e3bcd | Use separate llama-cpp-python packages for GGML support | 2023-08-26 10:40:08 -05:00
oobabooga | 83640d6f43 | Replace ggml occurences with gguf | 2023-08-26 01:06:59 -07:00
jllllll | db42b365c9 | Fix ctransformers threads auto-detection (#3688) | 2023-08-25 14:37:02 -03:00