oobabooga | 13ac55fa18 | Reorder some functions | 2023-09-19 13:51:57 -07:00
oobabooga | e2fddd9584 | More robust autoscrolling (attempt) | 2023-09-19 13:12:34 -07:00
oobabooga | 03dc69edc5 | ExLlama_HF (v1 and v2) prefix matching | 2023-09-19 13:12:19 -07:00
oobabooga | 5075087461 | Fix command-line arguments being ignored | 2023-09-19 13:11:46 -07:00
oobabooga | ff5d3d2d09 | Add missing import | 2023-09-18 16:26:54 -07:00
oobabooga | 605ec3c9f2 | Add a warning about ExLlamaV2 without flash-attn | 2023-09-18 12:26:35 -07:00
oobabooga | f0ef971edb | Remove obsolete warning | 2023-09-18 12:25:10 -07:00
oobabooga | 745807dc03 | Faster llamacpp_HF prefix matching | 2023-09-18 11:02:45 -07:00
BadisG | 893a72a1c5 | Stop generation immediately when using "Maximum tokens/second" (#3952) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-09-18 14:27:06 -03:00
jllllll | b7c55665c1 | Bump llama-cpp-python to 0.2.6 (#3982) | 2023-09-18 14:08:37 -03:00
Cebtenzzre | 8466cf229a | llama.cpp: fix ban_eos_token (#3987) | 2023-09-18 12:15:02 -03:00
oobabooga | 0ede2965d5 | Remove an error message | 2023-09-17 18:46:08 -07:00
dependabot[bot] | 661bfaac8e | Update accelerate from ==0.22.* to ==0.23.* (#3981) | 2023-09-17 22:42:12 -03:00
Chenxiao Wang | 347aed4254 | extensions/openai: load extension settings via settings.yaml (#3953) | 2023-09-17 22:39:29 -03:00
missionfloyd | cc8eda298a | Move hover menu shortcuts to right side (#3951) | 2023-09-17 22:33:00 -03:00
oobabooga | 280cca9f66 | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-09-17 18:01:27 -07:00
oobabooga | b062d50c45 | Remove exllama import that causes problems | 2023-09-17 18:00:32 -07:00
James Braza | fee38e0601 | Simplified ExLlama cloning instructions and failure message (#3972) | 2023-09-17 19:26:05 -03:00
Thireus ☠ | 45335fa8f4 | Bump ExLlamav2 to v0.0.2 (#3970) | 2023-09-17 19:24:40 -03:00
Lu Guanghua | 9858acee7b | Fix unexpected extensions load after gradio restart (#3965) | 2023-09-17 17:35:43 -03:00
oobabooga | d9b0f2c9c3 | Fix llama.cpp double decoding | 2023-09-17 13:07:48 -07:00
FartyPants | 230b562d53 | Training_PRO extension - added target selector (#3969) | 2023-09-17 17:00:00 -03:00
oobabooga | d71465708c | llamacpp_HF prefix matching | 2023-09-17 11:51:01 -07:00
oobabooga | 763ea3bcb2 | Improved multimodal error message | 2023-09-17 09:22:16 -07:00
oobabooga | 37e2980e05 | Recommend mul_mat_q for llama.cpp | 2023-09-17 08:27:11 -07:00
oobabooga | a069f3904c | Undo part of ad8ac545a5 | 2023-09-17 08:12:23 -07:00
FartyPants | e34c6e6938 | Training PRO extension (#3961) | 2023-09-17 11:09:31 -03:00
oobabooga | ad8ac545a5 | Tokenization improvements | 2023-09-17 07:02:00 -07:00
saltacc | cd08eb0753 | token probs for non HF loaders (#3957) | 2023-09-17 10:42:32 -03:00
Shulzhenko Anatolii | 0668f4e67f | Add speechrecognition dependency for OpenAI extension (#3959) | 2023-09-16 13:49:48 -03:00
wizd | cc7f345c29 | add whisper api to openai plugin (#3958) | 2023-09-16 12:04:04 -03:00
Lu Guanghua | cd534ba46e | Fix Google Translate escaping (#3827) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-09-16 10:18:06 -03:00
kalomaze | 7c9664ed35 | Allow full model URL to be used for download (#3919) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-09-16 10:06:13 -03:00
saltacc | ed6b6411fb | Fix exllama tokenizers (#3954) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-09-16 09:42:38 -03:00
oobabooga | 8d85425e09 | Increase --chat-buttons height | 2023-09-15 21:21:24 -07:00
oobabooga | e75489c252 | Update README | 2023-09-15 21:04:51 -07:00
oobabooga | 52c4fb75ff | Remove extra space in --chat-buttons | 2023-09-15 20:56:30 -07:00
missionfloyd | 2ad6ca8874 | Add back chat buttons with --chat-buttons (#3947) | 2023-09-16 00:39:37 -03:00
oobabooga | f5fb1ee666 | Change a comment | 2023-09-15 20:16:30 -07:00
oobabooga | 2c1b548cea | Minor fix | 2023-09-15 20:14:32 -07:00
oobabooga | 8f97e87cac | Lint the openai extension | 2023-09-15 20:11:16 -07:00
oobabooga | 760510db52 | Change a height | 2023-09-15 19:41:53 -07:00
oobabooga | ef04138bc0 | Improve the UI tokenizer | 2023-09-15 19:30:44 -07:00
oobabooga | c3e4c9fdc2 | Add a simple tokenizer to the UI | 2023-09-15 19:09:03 -07:00
saltacc | f01b9aa71f | Add customizable ban tokens (#3899) | 2023-09-15 18:27:27 -03:00
oobabooga | fb864dad7b | Update README | 2023-09-15 13:00:46 -07:00
oobabooga | 5b117590ad | Add some scrollbars to Parameters tab | 2023-09-15 09:17:37 -07:00
oobabooga | a3a5ffe651 | Adjust Default tab heights | 2023-09-15 09:12:42 -07:00
oobabooga | 985020f038 | Adjust token counter height | 2023-09-15 08:50:59 -07:00
Johan | fdcee0c215 | Allow custom tokenizer for llamacpp_HF loader (#3941) | 2023-09-15 12:38:38 -03:00