Commit Graph

555 Commits

Author SHA1 Message Date
oobabooga
239b11c94b Minor bug fixes 2023-06-17 17:57:56 -03:00
oobabooga
1e400218e9 Fix a typo 2023-06-16 21:01:57 -03:00
oobabooga
5f392122fd Add gpu_split param to ExLlama 2023-06-16 20:49:36 -03:00
    Adapted from code created by Ph0rk0z. Thank you Ph0rk0z.
oobabooga
83be8eacf0 Minor fix 2023-06-16 20:38:32 -03:00
oobabooga
9f40032d32 Add ExLlama support (#2444) 2023-06-16 20:35:38 -03:00
oobabooga
dea43685b0 Add some clarifications 2023-06-16 19:10:53 -03:00
oobabooga
7ef6a50e84 Reorganize model loading UI completely (#2720) 2023-06-16 19:00:37 -03:00
Tom Jobbins
646b0c889f AutoGPTQ: Add UI and command line support for disabling fused attention and fused MLP (#2648) 2023-06-15 23:59:54 -03:00
oobabooga
474dc7355a Allow API requests to use parameter presets 2023-06-14 11:32:20 -03:00
FartyPants
9f150aedc3 A small UI change in Models menu (#2640) 2023-06-12 01:24:44 -03:00
oobabooga
da5d9a28d8 Fix tabbed extensions showing up at the bottom of the UI 2023-06-11 21:20:51 -03:00
oobabooga
ae5e2b3470 Reorganize a bit 2023-06-11 19:50:20 -03:00
oobabooga
f4defde752 Add a menu for installing extensions 2023-06-11 17:11:06 -03:00
oobabooga
8e73806b20 Improve "Interface mode" appearance 2023-06-11 15:29:45 -03:00
oobabooga
ac122832f7 Make dropdown menus more similar to automatic1111 2023-06-11 14:20:16 -03:00
oobabooga
6133675e0f Add menus for saving presets/characters/instruction templates/prompts (#2621) 2023-06-11 12:19:18 -03:00
brandonj60
b04e18d10c Add Mirostat v2 sampling to transformer models (#2571) 2023-06-09 21:26:31 -03:00
oobabooga
eb2601a8c3 Reorganize Parameters tab 2023-06-06 14:51:02 -03:00
oobabooga
f06a1387f0 Reorganize Models tab 2023-06-06 07:58:07 -03:00
oobabooga
d49d299b67 Change a message 2023-06-06 07:54:56 -03:00
oobabooga
7ed1e35fbf Reorganize Parameters tab in chat mode 2023-06-06 07:46:25 -03:00
oobabooga
00b94847da Remove softprompt support 2023-06-06 07:42:23 -03:00
oobabooga
f276d88546 Use AutoGPTQ by default for GPTQ models 2023-06-05 15:41:48 -03:00
oobabooga
6a75bda419 Assign some 4096 seq lengths 2023-06-05 12:07:52 -03:00
oobabooga
19f78684e6 Add "Start reply with" feature to chat mode 2023-06-02 13:58:08 -03:00
oobabooga
28198bc15c Change some headers 2023-06-02 11:28:43 -03:00
oobabooga
5177cdf634 Change AutoGPTQ info 2023-06-02 11:19:44 -03:00
oobabooga
8e98633efd Add a description for chat_prompt_size 2023-06-02 11:13:22 -03:00
oobabooga
5a8162a46d Reorganize models tab 2023-06-02 02:24:15 -03:00
oobabooga
2f6631195a Add desc_act checkbox to the UI 2023-06-02 01:45:46 -03:00
Morgan Schweers
1aed2b9e52 Make it possible to download protected HF models from the command line. (#2408) 2023-06-01 00:11:21 -03:00
oobabooga
486ddd62df Add tfs and top_a to the API examples 2023-05-31 23:44:38 -03:00
oobabooga
3209440b7c Rearrange chat buttons 2023-05-30 00:17:31 -03:00
Luis Lopez
9e7204bef4 Add tail-free and top-a sampling (#2357) 2023-05-29 21:40:01 -03:00
oobabooga
1394f44e14 Add triton checkbox for AutoGPTQ 2023-05-29 15:32:45 -03:00
Honkware
204731952a Falcon support (trust-remote-code and autogptq checkboxes) (#2367) 2023-05-29 10:20:18 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga
f27135bdd3 Add Eta Sampling preset 2023-05-28 22:44:35 -03:00
    Also remove some presets that I do not consider relevant
oobabooga
00ebea0b2a Use YAML for presets and settings 2023-05-28 22:34:12 -03:00
oobabooga
fc33216477 Small fix for n_ctx in llama.cpp 2023-05-25 13:55:51 -03:00
oobabooga
37d4ad012b Add a button for rendering markdown for any model 2023-05-25 11:59:27 -03:00
DGdev91
cf088566f8 Make llama.cpp read prompt size and seed from settings (#2299) 2023-05-25 10:29:31 -03:00
oobabooga
361451ba60 Add --load-in-4bit parameter (#2320) 2023-05-25 01:14:13 -03:00
Gabriel Terrien
fc116711b0 FIX save_model_settings function to also update shared.model_config (#2282) 2023-05-24 10:01:07 -03:00
flurb18
d37a28730d Beginning of multi-user support (#2262) 2023-05-24 09:38:20 -03:00
    Adds a lock to generate_reply
Gabriel Terrien
7aed53559a Support of the --gradio-auth flag (#2283) 2023-05-23 20:39:26 -03:00
oobabooga
8b9ba3d7b4 Fix a typo 2023-05-22 20:13:03 -03:00
Gabriel Terrien
0f51b64bb3 Add a "dark_theme" option to settings.json (#2288) 2023-05-22 19:45:11 -03:00
oobabooga
c5446ae0e2 Fix a link 2023-05-22 19:38:34 -03:00
oobabooga
c0fd7f3257 Add mirostat parameters for llama.cpp (#2287) 2023-05-22 19:37:24 -03:00
oobabooga
ec7437f00a Better way to toggle light/dark mode 2023-05-22 03:19:01 -03:00