oobabooga
ac122832f7
Make dropdown menus more similar to automatic1111
2023-06-11 14:20:16 -03:00
oobabooga
6133675e0f
Add menus for saving presets/characters/instruction templates/prompts ( #2621 )
2023-06-11 12:19:18 -03:00
brandonj60
b04e18d10c
Add Mirostat v2 sampling to transformer models ( #2571 )
2023-06-09 21:26:31 -03:00
oobabooga
eb2601a8c3
Reorganize Parameters tab
2023-06-06 14:51:02 -03:00
oobabooga
f06a1387f0
Reorganize Models tab
2023-06-06 07:58:07 -03:00
oobabooga
d49d299b67
Change a message
2023-06-06 07:54:56 -03:00
oobabooga
7ed1e35fbf
Reorganize Parameters tab in chat mode
2023-06-06 07:46:25 -03:00
oobabooga
00b94847da
Remove softprompt support
2023-06-06 07:42:23 -03:00
oobabooga
f276d88546
Use AutoGPTQ by default for GPTQ models
2023-06-05 15:41:48 -03:00
oobabooga
6a75bda419
Assign 4096 sequence lengths to some models
2023-06-05 12:07:52 -03:00
oobabooga
19f78684e6
Add "Start reply with" feature to chat mode
2023-06-02 13:58:08 -03:00
oobabooga
28198bc15c
Change some headers
2023-06-02 11:28:43 -03:00
oobabooga
5177cdf634
Change AutoGPTQ info
2023-06-02 11:19:44 -03:00
oobabooga
8e98633efd
Add a description for chat_prompt_size
2023-06-02 11:13:22 -03:00
oobabooga
5a8162a46d
Reorganize models tab
2023-06-02 02:24:15 -03:00
oobabooga
2f6631195a
Add desc_act checkbox to the UI
2023-06-02 01:45:46 -03:00
Morgan Schweers
1aed2b9e52
Make it possible to download protected HF models from the command line. ( #2408 )
2023-06-01 00:11:21 -03:00
oobabooga
486ddd62df
Add tfs and top_a to the API examples
2023-05-31 23:44:38 -03:00
oobabooga
3209440b7c
Rearrange chat buttons
2023-05-30 00:17:31 -03:00
Luis Lopez
9e7204bef4
Add tail-free and top-a sampling ( #2357 )
2023-05-29 21:40:01 -03:00
oobabooga
1394f44e14
Add triton checkbox for AutoGPTQ
2023-05-29 15:32:45 -03:00
Honkware
204731952a
Falcon support (trust-remote-code and autogptq checkboxes) ( #2367 )
...
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-05-29 10:20:18 -03:00
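The trust-remote-code checkbox mentioned above maps onto the trust_remote_code argument that transformers requires when a model ships custom modeling code, as Falcon did at the time. A minimal sketch of the idea outside the webui; the checkpoint name, dtype, and device settings are placeholder assumptions, not the repository's loader code:

```python
# Sketch: loading a model whose architecture lives as custom code on the Hub.
# trust_remote_code=True lets transformers execute that repository code, which
# is what the "trust-remote-code" checkbox toggles.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "tiiuae/falcon-7b-instruct"  # placeholder checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # assumption; choose what fits your hardware
    trust_remote_code=True,      # allow the repo's custom modeling code to run
    device_map="auto",           # requires accelerate
)

inputs = tokenizer("Hello, Falcon!", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```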
oobabooga
f27135bdd3
Add Eta Sampling preset
...
Also remove some presets that I do not consider relevant
2023-05-28 22:44:35 -03:00
oobabooga
00ebea0b2a
Use YAML for presets and settings
2023-05-28 22:34:12 -03:00
oobabooga
fc33216477
Small fix for n_ctx in llama.cpp
2023-05-25 13:55:51 -03:00
oobabooga
37d4ad012b
Add a button for rendering markdown for any model
2023-05-25 11:59:27 -03:00
DGdev91
cf088566f8
Make llama.cpp read prompt size and seed from settings ( #2299 )
2023-05-25 10:29:31 -03:00
oobabooga
361451ba60
Add --load-in-4bit parameter ( #2320 )
2023-05-25 01:14:13 -03:00
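A --load-in-4bit style option corresponds to 4-bit quantized loading through bitsandbytes in transformers. A hedged sketch of that loading path, assuming recent enough transformers/bitsandbytes versions; the model name and config values are placeholders for illustration:

```python
# Sketch of 4-bit quantized loading; values are illustrative, not recommendations.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "facebook/opt-1.3b"  # placeholder model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize weights to 4 bits at load time
    bnb_4bit_compute_dtype=torch.bfloat16,  # assumption; compute dtype for matmuls
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",  # requires accelerate + bitsandbytes
)
```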
Gabriel Terrien
fc116711b0
Fix save_model_settings function to also update shared.model_config ( #2282 )
2023-05-24 10:01:07 -03:00
flurb18
d37a28730d
Beginning of multi-user support ( #2262 )
...
Adds a lock to generate_reply
2023-05-24 09:38:20 -03:00
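The commit body above notes that a lock was added around generate_reply so that concurrent users cannot generate on the same model at the same time. A sketch of that general pattern, with illustrative names rather than the repository's exact code:

```python
# Serialize calls to the text generation function so concurrent requests
# do not interleave on a single loaded model. Names are illustrative only.
import threading

generation_lock = threading.Lock()

def generate_reply_unlocked(prompt, **kwargs):
    # Placeholder for the actual model call.
    return f"(generated text for: {prompt})"

def generate_reply(prompt, **kwargs):
    # Only one request generates at a time; other callers block here.
    with generation_lock:
        return generate_reply_unlocked(prompt, **kwargs)
```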
Gabriel Terrien
7aed53559a
Support of the --gradio-auth flag ( #2283 )
2023-05-23 20:39:26 -03:00
oobabooga
8b9ba3d7b4
Fix a typo
2023-05-22 20:13:03 -03:00
Gabriel Terrien
0f51b64bb3
Add a "dark_theme" option to settings.json ( #2288 )
2023-05-22 19:45:11 -03:00
oobabooga
c5446ae0e2
Fix a link
2023-05-22 19:38:34 -03:00
oobabooga
c0fd7f3257
Add mirostat parameters for llama.cpp ( #2287 )
2023-05-22 19:37:24 -03:00
oobabooga
ec7437f00a
Better way to toggle light/dark mode
2023-05-22 03:19:01 -03:00
oobabooga
d46f5a58a3
Add a button for toggling dark/light mode
2023-05-22 03:11:44 -03:00
oobabooga
753f6c5250
Attempt at making interface restart more robust
2023-05-22 00:26:07 -03:00
oobabooga
30225b9dd0
Fix --no-stream queue bug
2023-05-22 00:02:59 -03:00
oobabooga
288912baf1
Add a description for the extensions checkbox group
2023-05-21 23:33:37 -03:00
oobabooga
6e77844733
Add a description for penalty_alpha
2023-05-21 23:09:30 -03:00
oobabooga
e3d578502a
Improve "Chat settings" tab appearance a bit
2023-05-21 22:58:14 -03:00
oobabooga
e116d31180
Prevent unwanted log messages from modules
2023-05-21 22:42:34 -03:00
oobabooga
d7fabe693d
Reorganize parameters tab
2023-05-21 16:24:47 -03:00
oobabooga
8ac3636966
Add epsilon_cutoff/eta_cutoff parameters ( #2258 )
2023-05-21 15:11:57 -03:00
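epsilon_cutoff and eta_cutoff are truncation-sampling parameters exposed by transformers' generate(); a minimal sketch of passing them through, where the model and values are placeholders and support depends on the installed transformers version:

```python
# Sketch: epsilon_cutoff / eta_cutoff are forwarded to generate() as sampling
# parameters; the values below are placeholders, not tuned recommendations.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The quick brown fox", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    do_sample=True,
    epsilon_cutoff=3e-4,  # drop tokens whose probability falls below this cutoff
    eta_cutoff=3e-4,      # entropy-adaptive variant of the cutoff
    max_new_tokens=20,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```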
Matthew McAllister
ab6acddcc5
Add Save/Delete character buttons ( #1870 )
...
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-05-20 21:48:45 -03:00
HappyWorldGames
a3e9769e31
Add an audible notification after text generation in the web UI ( #1277 )
...
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-05-19 23:16:06 -03:00
oobabooga
f052ab9c8f
Fix setting pre_layer from within the ui
2023-05-17 23:17:44 -03:00
oobabooga
fd743a0207
Small change
2023-05-17 02:34:29 -03:00
LoopLooter
aeb1b7a9c5
Add a feature to save prompts with custom names ( #1583 )
...
---------
Co-authored-by: LoopLooter <looplooter>
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-05-17 02:30:45 -03:00
oobabooga
85f74961f9
Update "Interface mode" tab
2023-05-17 01:57:51 -03:00
oobabooga
ce21804ec7
Allow extensions to define a new tab
2023-05-17 01:31:56 -03:00
oobabooga
a84f499718
Allow extensions to define custom CSS and JS
2023-05-17 00:30:54 -03:00
oobabooga
824fa8fc0e
Attempt at making interface restart more robust
2023-05-16 22:27:43 -03:00
oobabooga
7584d46c29
Refactor models.py ( #2113 )
2023-05-16 19:52:22 -03:00
oobabooga
5cd6dd4287
Fix no-mmap bug
2023-05-16 17:35:49 -03:00
oobabooga
89e37626ab
Reorganize chat settings tab
2023-05-16 17:22:59 -03:00
Jakub Strnad
0227e738ed
Add settings UI for llama.cpp and fixed reloading of llama.cpp models ( #2087 )
2023-05-15 19:51:23 -03:00
oobabooga
3b886f9c9f
Add chat-instruct mode ( #2049 )
2023-05-14 10:43:55 -03:00
oobabooga
437d1c7ead
Fix bug in save_model_settings
2023-05-12 14:33:00 -03:00
oobabooga
146a9cb393
Allow superbooga to download URLs in parallel
2023-05-12 14:19:55 -03:00
oobabooga
e283ddc559
Change how spaces are handled in continue/generation attempts
2023-05-12 12:50:29 -03:00
oobabooga
5eaa914e1b
Fix settings.json being ignored because of config.yaml
2023-05-12 06:09:45 -03:00
oobabooga
a77965e801
Make the regex for "Save settings for this model" exact
2023-05-12 00:43:13 -03:00
oobabooga
f7dbddfff5
Add a variable for tts extensions to use
2023-05-11 16:12:46 -03:00
oobabooga
638c6a65a2
Refactor chat functions ( #2003 )
2023-05-11 15:37:04 -03:00
oobabooga
e5b1547849
Fix reload model button
2023-05-10 14:44:25 -03:00
oobabooga
3316e33d14
Remove unused code
2023-05-10 11:59:59 -03:00
oobabooga
cd36b8f739
Remove space
2023-05-10 01:41:33 -03:00
oobabooga
bdf1274b5d
Remove duplicate code
2023-05-10 01:34:04 -03:00
oobabooga
3913155c1f
Style improvements ( #1957 )
2023-05-09 22:49:39 -03:00
Wojtab
e9e75a9ec7
Generalize multimodality (llava/minigpt4 7b and 13b now supported) ( #1741 )
2023-05-09 20:18:02 -03:00
oobabooga
13e7ebfc77
Change a comment
2023-05-09 15:56:32 -03:00
LaaZa
218bd64bd1
Add the option to not automatically load the selected model ( #1762 )
...
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-05-09 15:52:35 -03:00
Kamil Szurant
641500dcb9
Use current input for Impersonate (continue impersonate feature) ( #1147 )
2023-05-09 02:37:42 -03:00
oobabooga
b5260b24f1
Add support for custom chat styles ( #1917 )
2023-05-08 12:35:03 -03:00
Matthew McAllister
0c048252b5
Fix character menu when default chat mode is 'instruct' ( #1873 )
2023-05-07 23:50:38 -03:00
oobabooga
56a5969658
Improve the separation between instruct/chat modes ( #1896 )
2023-05-07 23:47:02 -03:00
oobabooga
56f6b7052a
Sort dropdowns numerically
2023-05-05 23:14:56 -03:00
oobabooga
8aafb1f796
Refactor text_generation.py, add support for custom generation functions ( #1817 )
2023-05-05 18:53:03 -03:00
Tom Jobbins
876fbb97c0
Allow downloading model from HF branch via UI ( #1662 )
...
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-05-05 13:59:01 -03:00
oobabooga
95d04d6a8d
Better warning messages
2023-05-03 21:43:17 -03:00
Tom Jobbins
3c67fc0362
Allow groupsize 1024, needed for larger models (e.g. 30B) to lower VRAM usage ( #1660 )
2023-05-02 00:46:26 -03:00
oobabooga
a777c058af
Precise prompts for instruct mode
2023-04-26 03:21:53 -03:00
oobabooga
f39c99fa14
Load more than one LoRA with --lora, fix a bug
2023-04-25 22:58:48 -03:00
oobabooga
b6af2e56a2
Add --character flag, add character to settings.json
2023-04-24 13:19:42 -03:00
oobabooga
caaa556159
Move extensions block definition to the bottom
2023-04-24 03:30:35 -03:00
oobabooga
b1ee674d75
Make interface state (mostly) persistent on page reload
2023-04-24 03:05:47 -03:00
oobabooga
47809e28aa
Minor changes
2023-04-24 01:04:48 -03:00
Andy Salerno
654933c634
New universal API with streaming/blocking endpoints ( #990 )
...
Previous title: Add api_streaming extension and update api-example-stream to use it
* Merge with latest main
* Add parameter capturing encoder_repetition_penalty
* Change some defaults, minor fixes
* Add --api, --public-api flags
* Remove an unneeded/broken comment from the blocking API startup; the comment is already correctly emitted in try_start_cloudflared by calling the lambda we pass in.
* Update the on_start message for blocking_api; it should say 'non-streaming', not 'streaming'
* Update the API examples
* Change a comment
* Update README
* Remove the gradio API
* Remove unused import
* Minor change
* Remove unused import
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-04-23 15:52:43 -03:00
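For the blocking endpoint added in this commit, a minimal client sketch; the host, port, route, and payload/response shapes follow the api-example scripts of that era and should be treated as assumptions rather than a stable contract:

```python
# Hedged client sketch for the blocking API (server started with --api).
# URL and field names are assumptions; check the current API examples.
import requests

URL = "http://localhost:5000/api/v1/generate"  # assumed default host, port, route

payload = {
    "prompt": "Write a haiku about GPUs.",
    "max_new_tokens": 60,  # parameter names assumed from the example scripts
    "temperature": 0.7,
    "top_p": 0.9,
}

response = requests.post(URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["results"][0]["text"])  # assumed response shape
```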
oobabooga
2dca8bb25e
Sort imports
2023-04-21 17:20:59 -03:00
oobabooga
c238ba9532
Add a 'Count tokens' button
2023-04-21 17:18:34 -03:00
oobabooga
2d766d2e19
Improve notebook mode button sizes
2023-04-21 02:37:58 -03:00
oobabooga
b4af319fa2
Add a workaround for GALACTICA on some systems
2023-04-19 01:43:10 -03:00
oobabooga
61126f4674
Change the button styles
2023-04-19 00:56:24 -03:00
oobabooga
649e4017a5
Style improvements
2023-04-19 00:36:28 -03:00
oobabooga
c58c1d89bd
Clean method to prevent gradio from phoning home
2023-04-18 03:56:20 -03:00
oobabooga
e1b80e6fe6
Comment the gradio patch
2023-04-18 01:57:59 -03:00
oobabooga
36f7c022f2
Rename a file
2023-04-18 01:38:33 -03:00
oobabooga
00186f76f4
Monkey patch gradio to prevent it from calling home
2023-04-18 01:13:16 -03:00