Author | Commit | Message | Date
oobabooga | 09c781b16f | Add modules/block_requests.py. This has become unnecessary, but it could be useful in the future for other libraries. | 2023-06-18 16:31:14 -03:00
oobabooga | 44f28830d1 | Chat CSS: fix ul, li, pre styles + remove redefinitions | 2023-06-18 15:20:51 -03:00
oobabooga | 239b11c94b | Minor bug fixes | 2023-06-17 17:57:56 -03:00
oobabooga | 1e400218e9 | Fix a typo | 2023-06-16 21:01:57 -03:00
oobabooga | 5f392122fd | Add gpu_split param to ExLlama. Adapted from code created by Ph0rk0z. Thank you Ph0rk0z. | 2023-06-16 20:49:36 -03:00
oobabooga | 83be8eacf0 | Minor fix | 2023-06-16 20:38:32 -03:00
oobabooga | 9f40032d32 | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00
oobabooga | dea43685b0 | Add some clarifications | 2023-06-16 19:10:53 -03:00
oobabooga | 7ef6a50e84 | Reorganize model loading UI completely (#2720) | 2023-06-16 19:00:37 -03:00
Tom Jobbins | 646b0c889f | AutoGPTQ: Add UI and command line support for disabling fused attention and fused MLP (#2648) | 2023-06-15 23:59:54 -03:00
oobabooga | 474dc7355a | Allow API requests to use parameter presets | 2023-06-14 11:32:20 -03:00
FartyPants | 9f150aedc3 | A small UI change in Models menu (#2640) | 2023-06-12 01:24:44 -03:00
oobabooga | da5d9a28d8 | Fix tabbed extensions showing up at the bottom of the UI | 2023-06-11 21:20:51 -03:00
oobabooga | ae5e2b3470 | Reorganize a bit | 2023-06-11 19:50:20 -03:00
oobabooga | f4defde752 | Add a menu for installing extensions | 2023-06-11 17:11:06 -03:00
oobabooga | 8e73806b20 | Improve "Interface mode" appearance | 2023-06-11 15:29:45 -03:00
oobabooga | ac122832f7 | Make dropdown menus more similar to automatic1111 | 2023-06-11 14:20:16 -03:00
oobabooga | 6133675e0f | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00
brandonj60 | b04e18d10c | Add Mirostat v2 sampling to transformer models (#2571) | 2023-06-09 21:26:31 -03:00
oobabooga | eb2601a8c3 | Reorganize Parameters tab | 2023-06-06 14:51:02 -03:00
oobabooga | f06a1387f0 | Reorganize Models tab | 2023-06-06 07:58:07 -03:00
oobabooga | d49d299b67 | Change a message | 2023-06-06 07:54:56 -03:00
oobabooga | 7ed1e35fbf | Reorganize Parameters tab in chat mode | 2023-06-06 07:46:25 -03:00
oobabooga | 00b94847da | Remove softprompt support | 2023-06-06 07:42:23 -03:00
oobabooga | f276d88546 | Use AutoGPTQ by default for GPTQ models | 2023-06-05 15:41:48 -03:00
oobabooga | 6a75bda419 | Assign some 4096 seq lengths | 2023-06-05 12:07:52 -03:00
oobabooga | 19f78684e6 | Add "Start reply with" feature to chat mode | 2023-06-02 13:58:08 -03:00
oobabooga | 28198bc15c | Change some headers | 2023-06-02 11:28:43 -03:00
oobabooga | 5177cdf634 | Change AutoGPTQ info | 2023-06-02 11:19:44 -03:00
oobabooga | 8e98633efd | Add a description for chat_prompt_size | 2023-06-02 11:13:22 -03:00
oobabooga | 5a8162a46d | Reorganize models tab | 2023-06-02 02:24:15 -03:00
oobabooga | 2f6631195a | Add desc_act checkbox to the UI | 2023-06-02 01:45:46 -03:00
Morgan Schweers | 1aed2b9e52 | Make it possible to download protected HF models from the command line. (#2408) | 2023-06-01 00:11:21 -03:00
oobabooga | 486ddd62df | Add tfs and top_a to the API examples | 2023-05-31 23:44:38 -03:00
oobabooga | 3209440b7c | Rearrange chat buttons | 2023-05-30 00:17:31 -03:00
Luis Lopez | 9e7204bef4 | Add tail-free and top-a sampling (#2357) | 2023-05-29 21:40:01 -03:00
oobabooga | 1394f44e14 | Add triton checkbox for AutoGPTQ | 2023-05-29 15:32:45 -03:00
Honkware | 204731952a | Falcon support (trust-remote-code and autogptq checkboxes) (#2367). Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com> | 2023-05-29 10:20:18 -03:00
oobabooga | f27135bdd3 | Add Eta Sampling preset. Also remove some presets that I do not consider relevant. | 2023-05-28 22:44:35 -03:00
oobabooga | 00ebea0b2a | Use YAML for presets and settings | 2023-05-28 22:34:12 -03:00
oobabooga | fc33216477 | Small fix for n_ctx in llama.cpp | 2023-05-25 13:55:51 -03:00
oobabooga | 37d4ad012b | Add a button for rendering markdown for any model | 2023-05-25 11:59:27 -03:00
DGdev91 | cf088566f8 | Make llama.cpp read prompt size and seed from settings (#2299) | 2023-05-25 10:29:31 -03:00
oobabooga | 361451ba60 | Add --load-in-4bit parameter (#2320) | 2023-05-25 01:14:13 -03:00
Gabriel Terrien | fc116711b0 | FIX save_model_settings function to also update shared.model_config (#2282) | 2023-05-24 10:01:07 -03:00
flurb18 | d37a28730d | Beginning of multi-user support (#2262). Adds a lock to generate_reply. | 2023-05-24 09:38:20 -03:00
Gabriel Terrien | 7aed53559a | Support of the --gradio-auth flag (#2283) | 2023-05-23 20:39:26 -03:00
oobabooga | 8b9ba3d7b4 | Fix a typo | 2023-05-22 20:13:03 -03:00
Gabriel Terrien | 0f51b64bb3 | Add a "dark_theme" option to settings.json (#2288) | 2023-05-22 19:45:11 -03:00
oobabooga | c5446ae0e2 | Fix a link | 2023-05-22 19:38:34 -03:00
oobabooga | c0fd7f3257 | Add mirostat parameters for llama.cpp (#2287) | 2023-05-22 19:37:24 -03:00
oobabooga | ec7437f00a | Better way to toggle light/dark mode | 2023-05-22 03:19:01 -03:00
oobabooga | d46f5a58a3 | Add a button for toggling dark/light mode | 2023-05-22 03:11:44 -03:00
oobabooga | 753f6c5250 | Attempt at making interface restart more robust | 2023-05-22 00:26:07 -03:00
oobabooga | 30225b9dd0 | Fix --no-stream queue bug | 2023-05-22 00:02:59 -03:00
oobabooga | 288912baf1 | Add a description for the extensions checkbox group | 2023-05-21 23:33:37 -03:00
oobabooga | 6e77844733 | Add a description for penalty_alpha | 2023-05-21 23:09:30 -03:00
oobabooga | e3d578502a | Improve "Chat settings" tab appearance a bit | 2023-05-21 22:58:14 -03:00
oobabooga | e116d31180 | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00
oobabooga | d7fabe693d | Reorganize parameters tab | 2023-05-21 16:24:47 -03:00
oobabooga | 8ac3636966 | Add epsilon_cutoff/eta_cutoff parameters (#2258) | 2023-05-21 15:11:57 -03:00
Matthew McAllister | ab6acddcc5 | Add Save/Delete character buttons (#1870). Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com> | 2023-05-20 21:48:45 -03:00
HappyWorldGames | a3e9769e31 | Added an audible notification after text generation in web. (#1277). Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com> | 2023-05-19 23:16:06 -03:00
oobabooga | f052ab9c8f | Fix setting pre_layer from within the ui | 2023-05-17 23:17:44 -03:00
oobabooga | fd743a0207 | Small change | 2023-05-17 02:34:29 -03:00
LoopLooter | aeb1b7a9c5 | feature to save prompts with custom names (#1583). Co-authored-by: LoopLooter <looplooter>; Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com> | 2023-05-17 02:30:45 -03:00
oobabooga | 85f74961f9 | Update "Interface mode" tab | 2023-05-17 01:57:51 -03:00
oobabooga | ce21804ec7 | Allow extensions to define a new tab | 2023-05-17 01:31:56 -03:00
oobabooga | a84f499718 | Allow extensions to define custom CSS and JS | 2023-05-17 00:30:54 -03:00
oobabooga | 824fa8fc0e | Attempt at making interface restart more robust | 2023-05-16 22:27:43 -03:00
oobabooga | 7584d46c29 | Refactor models.py (#2113) | 2023-05-16 19:52:22 -03:00
oobabooga | 5cd6dd4287 | Fix no-mmap bug | 2023-05-16 17:35:49 -03:00
oobabooga | 89e37626ab | Reorganize chat settings tab | 2023-05-16 17:22:59 -03:00
Jakub Strnad | 0227e738ed | Add settings UI for llama.cpp and fixed reloading of llama.cpp models (#2087) | 2023-05-15 19:51:23 -03:00
oobabooga | 3b886f9c9f | Add chat-instruct mode (#2049) | 2023-05-14 10:43:55 -03:00
oobabooga | 437d1c7ead | Fix bug in save_model_settings | 2023-05-12 14:33:00 -03:00
oobabooga | 146a9cb393 | Allow superbooga to download URLs in parallel | 2023-05-12 14:19:55 -03:00
oobabooga | e283ddc559 | Change how spaces are handled in continue/generation attempts | 2023-05-12 12:50:29 -03:00
oobabooga | 5eaa914e1b | Fix settings.json being ignored because of config.yaml | 2023-05-12 06:09:45 -03:00
oobabooga | a77965e801 | Make the regex for "Save settings for this model" exact | 2023-05-12 00:43:13 -03:00
oobabooga | f7dbddfff5 | Add a variable for tts extensions to use | 2023-05-11 16:12:46 -03:00
oobabooga | 638c6a65a2 | Refactor chat functions (#2003) | 2023-05-11 15:37:04 -03:00
oobabooga | e5b1547849 | Fix reload model button | 2023-05-10 14:44:25 -03:00
oobabooga | 3316e33d14 | Remove unused code | 2023-05-10 11:59:59 -03:00
oobabooga | cd36b8f739 | Remove space | 2023-05-10 01:41:33 -03:00
oobabooga | bdf1274b5d | Remove duplicate code | 2023-05-10 01:34:04 -03:00
oobabooga | 3913155c1f | Style improvements (#1957) | 2023-05-09 22:49:39 -03:00
Wojtab | e9e75a9ec7 | Generalize multimodality (llava/minigpt4 7b and 13b now supported) (#1741) | 2023-05-09 20:18:02 -03:00
oobabooga | 13e7ebfc77 | Change a comment | 2023-05-09 15:56:32 -03:00
LaaZa | 218bd64bd1 | Add the option to not automatically load the selected model (#1762). Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com> | 2023-05-09 15:52:35 -03:00
Kamil Szurant | 641500dcb9 | Use current input for Impersonate (continue impersonate feature) (#1147) | 2023-05-09 02:37:42 -03:00
oobabooga | b5260b24f1 | Add support for custom chat styles (#1917) | 2023-05-08 12:35:03 -03:00
Matthew McAllister | 0c048252b5 | Fix character menu when default chat mode is 'instruct' (#1873) | 2023-05-07 23:50:38 -03:00
oobabooga | 56a5969658 | Improve the separation between instruct/chat modes (#1896) | 2023-05-07 23:47:02 -03:00
oobabooga | 56f6b7052a | Sort dropdowns numerically | 2023-05-05 23:14:56 -03:00
oobabooga | 8aafb1f796 | Refactor text_generation.py, add support for custom generation functions (#1817) | 2023-05-05 18:53:03 -03:00
Tom Jobbins | 876fbb97c0 | Allow downloading model from HF branch via UI (#1662). Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com> | 2023-05-05 13:59:01 -03:00
oobabooga | 95d04d6a8d | Better warning messages | 2023-05-03 21:43:17 -03:00
Tom Jobbins | 3c67fc0362 | Allow groupsize 1024, needed for larger models eg 30B to lower VRAM usage (#1660) | 2023-05-02 00:46:26 -03:00
oobabooga | a777c058af | Precise prompts for instruct mode | 2023-04-26 03:21:53 -03:00