oobabooga | 3209440b7c | Rearrange chat buttons | 2023-05-30 00:17:31 -03:00
Luis Lopez | 9e7204bef4 | Add tail-free and top-a sampling (#2357) | 2023-05-29 21:40:01 -03:00
oobabooga | 1394f44e14 | Add triton checkbox for AutoGPTQ | 2023-05-29 15:32:45 -03:00
Honkware | 204731952a | Falcon support (trust-remote-code and autogptq checkboxes) (#2367) | 2023-05-29 10:20:18 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga | f27135bdd3 | Add Eta Sampling preset | 2023-05-28 22:44:35 -03:00
    Also remove some presets that I do not consider relevant
oobabooga | 00ebea0b2a | Use YAML for presets and settings | 2023-05-28 22:34:12 -03:00
oobabooga | fc33216477 | Small fix for n_ctx in llama.cpp | 2023-05-25 13:55:51 -03:00
oobabooga | 37d4ad012b | Add a button for rendering markdown for any model | 2023-05-25 11:59:27 -03:00
DGdev91 | cf088566f8 | Make llama.cpp read prompt size and seed from settings (#2299) | 2023-05-25 10:29:31 -03:00
oobabooga | 361451ba60 | Add --load-in-4bit parameter (#2320) | 2023-05-25 01:14:13 -03:00
Gabriel Terrien | fc116711b0 | FIX save_model_settings function to also update shared.model_config (#2282) | 2023-05-24 10:01:07 -03:00
flurb18 | d37a28730d | Beginning of multi-user support (#2262) | 2023-05-24 09:38:20 -03:00
    Adds a lock to generate_reply
Gabriel Terrien | 7aed53559a | Support of the --gradio-auth flag (#2283) | 2023-05-23 20:39:26 -03:00
oobabooga | 8b9ba3d7b4 | Fix a typo | 2023-05-22 20:13:03 -03:00
Gabriel Terrien | 0f51b64bb3 | Add a "dark_theme" option to settings.json (#2288) | 2023-05-22 19:45:11 -03:00
oobabooga | c5446ae0e2 | Fix a link | 2023-05-22 19:38:34 -03:00
oobabooga | c0fd7f3257 | Add mirostat parameters for llama.cpp (#2287) | 2023-05-22 19:37:24 -03:00
oobabooga | ec7437f00a | Better way to toggle light/dark mode | 2023-05-22 03:19:01 -03:00
oobabooga | d46f5a58a3 | Add a button for toggling dark/light mode | 2023-05-22 03:11:44 -03:00
oobabooga | 753f6c5250 | Attempt at making interface restart more robust | 2023-05-22 00:26:07 -03:00
oobabooga | 30225b9dd0 | Fix --no-stream queue bug | 2023-05-22 00:02:59 -03:00
oobabooga | 288912baf1 | Add a description for the extensions checkbox group | 2023-05-21 23:33:37 -03:00
oobabooga | 6e77844733 | Add a description for penalty_alpha | 2023-05-21 23:09:30 -03:00
oobabooga | e3d578502a | Improve "Chat settings" tab appearance a bit | 2023-05-21 22:58:14 -03:00
oobabooga | e116d31180 | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00
oobabooga | d7fabe693d | Reorganize parameters tab | 2023-05-21 16:24:47 -03:00
oobabooga | 8ac3636966 | Add epsilon_cutoff/eta_cutoff parameters (#2258) | 2023-05-21 15:11:57 -03:00
Matthew McAllister | ab6acddcc5 | Add Save/Delete character buttons (#1870) | 2023-05-20 21:48:45 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
HappyWorldGames | a3e9769e31 | Added an audible notification after text generation in web. (#1277) | 2023-05-19 23:16:06 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga | f052ab9c8f | Fix setting pre_layer from within the ui | 2023-05-17 23:17:44 -03:00
oobabooga | fd743a0207 | Small change | 2023-05-17 02:34:29 -03:00
LoopLooter | aeb1b7a9c5 | feature to save prompts with custom names (#1583) | 2023-05-17 02:30:45 -03:00
    Co-authored-by: LoopLooter <looplooter>
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga | 85f74961f9 | Update "Interface mode" tab | 2023-05-17 01:57:51 -03:00
oobabooga | ce21804ec7 | Allow extensions to define a new tab | 2023-05-17 01:31:56 -03:00
oobabooga | a84f499718 | Allow extensions to define custom CSS and JS | 2023-05-17 00:30:54 -03:00
oobabooga | 824fa8fc0e | Attempt at making interface restart more robust | 2023-05-16 22:27:43 -03:00
oobabooga | 7584d46c29 | Refactor models.py (#2113) | 2023-05-16 19:52:22 -03:00
oobabooga | 5cd6dd4287 | Fix no-mmap bug | 2023-05-16 17:35:49 -03:00
oobabooga | 89e37626ab | Reorganize chat settings tab | 2023-05-16 17:22:59 -03:00
Jakub Strnad | 0227e738ed | Add settings UI for llama.cpp and fixed reloading of llama.cpp models (#2087) | 2023-05-15 19:51:23 -03:00
oobabooga | 3b886f9c9f | Add chat-instruct mode (#2049) | 2023-05-14 10:43:55 -03:00
oobabooga | 437d1c7ead | Fix bug in save_model_settings | 2023-05-12 14:33:00 -03:00
oobabooga | 146a9cb393 | Allow superbooga to download URLs in parallel | 2023-05-12 14:19:55 -03:00
oobabooga | e283ddc559 | Change how spaces are handled in continue/generation attempts | 2023-05-12 12:50:29 -03:00
oobabooga | 5eaa914e1b | Fix settings.json being ignored because of config.yaml | 2023-05-12 06:09:45 -03:00
oobabooga | a77965e801 | Make the regex for "Save settings for this model" exact | 2023-05-12 00:43:13 -03:00
oobabooga | f7dbddfff5 | Add a variable for tts extensions to use | 2023-05-11 16:12:46 -03:00
oobabooga | 638c6a65a2 | Refactor chat functions (#2003) | 2023-05-11 15:37:04 -03:00
oobabooga | e5b1547849 | Fix reload model button | 2023-05-10 14:44:25 -03:00
oobabooga | 3316e33d14 | Remove unused code | 2023-05-10 11:59:59 -03:00