Commit Graph

654 Commits

Author SHA1 Message Date
oobabooga
b1ee674d75 Make interface state (mostly) persistent on page reload 2023-04-24 03:05:47 -03:00
oobabooga
47809e28aa Minor changes 2023-04-24 01:04:48 -03:00
Andy Salerno
654933c634
New universal API with streaming/blocking endpoints (#990)
Previous title: Add api_streaming extension and update api-example-stream to use it

* Merge with latest main

* Add parameter capturing encoder_repetition_penalty

* Change some defaults, minor fixes

* Add --api, --public-api flags

* remove unneeded/broken comment from blocking API startup. The comment is already correctly emitted in try_start_cloudflared by calling the lambda we pass in.

* Update on_start message for blocking_api, it should say 'non-streaming' and not 'streaming'

* Update the API examples

* Change a comment

* Update README

* Remove the gradio API

* Remove unused import

* Minor change

* Remove unused import

---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-04-23 15:52:43 -03:00
oobabooga
2dca8bb25e Sort imports 2023-04-21 17:20:59 -03:00
oobabooga
c238ba9532 Add a 'Count tokens' button 2023-04-21 17:18:34 -03:00
oobabooga
2d766d2e19 Improve notebook mode button sizes 2023-04-21 02:37:58 -03:00
oobabooga
b4af319fa2 Add a workaround for GALACTICA on some systems 2023-04-19 01:43:10 -03:00
oobabooga
61126f4674 Change the button styles 2023-04-19 00:56:24 -03:00
oobabooga
649e4017a5 Style improvements 2023-04-19 00:36:28 -03:00
oobabooga
c58c1d89bd
Clean method to prevent gradio from phoning home 2023-04-18 03:56:20 -03:00
oobabooga
e1b80e6fe6
Comment the gradio patch 2023-04-18 01:57:59 -03:00
oobabooga
36f7c022f2
Rename a file 2023-04-18 01:38:33 -03:00
oobabooga
00186f76f4
Monkey patch gradio to prevent it from calling home 2023-04-18 01:13:16 -03:00
oobabooga
c3dc348d1c Don't show 'None' in the LoRA list 2023-04-17 13:52:23 -03:00
oobabooga
209fcd21d5 Reorganize Parameters tab 2023-04-17 00:33:22 -03:00
oobabooga
b937c9d8c2
Add skip_special_tokens checkbox for Dolly model (#1218) 2023-04-16 14:24:49 -03:00
oobabooga
a9c7ef4159 Exclude yaml files from model list 2023-04-16 12:47:30 -03:00
Mikel Bober-Irizar
16a3a5b039
Merge pull request from GHSA-hv5m-3rp9-xcpf
* Remove eval of API input

* Remove unnecessary eval/exec for security

* Use ast.literal_eval

* Use ast.literal_eval

---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-04-16 01:36:50 -03:00
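The security advisory fix above replaces `eval`/`exec` with `ast.literal_eval`, which parses only Python literals and refuses anything executable. A minimal sketch of the difference:

```python
import ast

# literal_eval accepts only plain literals (dicts, lists, strings, numbers,
# booleans, None), so untrusted request bodies cannot execute code.
params = ast.literal_eval("{'max_new_tokens': 200, 'do_sample': True}")
assert params == {'max_new_tokens': 200, 'do_sample': True}

# A call expression is rejected with ValueError instead of being evaluated,
# which is exactly what eval() would NOT do:
try:
    ast.literal_eval("__import__('os').system('id')")
except ValueError:
    pass  # rejected, as intended
```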
oobabooga
ac189011cb Add "Save current settings for this model" button 2023-04-15 12:54:02 -03:00
oobabooga
b9dcba7762 Don't overwrite --gpu_memory on boot (#1237/#1235) 2023-04-15 11:59:31 -03:00
oobabooga
628f8e6168 Reorganize chat buttons 2023-04-14 23:17:15 -03:00
oobabooga
c4aa1a42b1 Fix chat history downloading 2023-04-14 19:38:30 -03:00
oobabooga
3a337cfded Use argparse defaults 2023-04-14 15:35:06 -03:00
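Several entries in this log add or tidy command-line flags (`--listen-host`, `--model-menu`, `--auto-launch`), and the commit above moves defaults into argparse itself. A minimal sketch of that pattern, reusing flag names from the log (the default values here are illustrative):

```python
import argparse

parser = argparse.ArgumentParser()
# Defaults live in argparse, so downstream code never re-checks for None.
parser.add_argument('--listen-host', type=str, default='127.0.0.1')
parser.add_argument('--model-menu', action='store_true')

args = parser.parse_args([])  # no flags passed -> argparse defaults apply
assert args.listen_host == '127.0.0.1'
assert args.model_menu is False
```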
oobabooga
43e01282b3 Don't override user initial wbits/groupsize 2023-04-14 15:24:03 -03:00
Alex "mcmonkey" Goodwin
64e3b44e0f
initial multi-lora support (#1103)
---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-04-14 14:52:06 -03:00
oobabooga
ebb81eb176 Add Undo/Regenerate buttons to notebook mode 2023-04-14 14:34:56 -03:00
oobabooga
581f739b2f Reorganize 2023-04-14 11:38:26 -03:00
oobabooga
8e31f2bad4
Automatically set wbits/groupsize/instruct based on model name (#1167) 2023-04-14 11:07:28 -03:00
v0xie
9d66957207
Add --listen-host launch option (#1122) 2023-04-13 21:35:08 -03:00
oobabooga
c13e8651ad Suppress "TypedStorage is deprecated" warnings 2023-04-13 12:09:42 -03:00
oobabooga
17ce7c8671 Suppress annoying bitsandbytes welcome message 2023-04-13 12:04:39 -03:00
oobabooga
fbb448ce4f If only 1 model is available, load that model 2023-04-13 11:44:10 -03:00
oobabooga
5744b31593 Reorganize some buttons 2023-04-13 11:05:47 -03:00
Xan
6e19ae4b2f
Fix gpt-j model type in UI (#1129) 2023-04-13 10:17:20 -03:00
oobabooga
ddbd237ec9 Better way to sort the models/loras 2023-04-12 22:56:32 -03:00
oobabooga
7dfbe54f42 Add --model-menu option 2023-04-12 21:24:26 -03:00
oobabooga
86c10c6f0c Add some labels 2023-04-12 18:39:21 -03:00
oobabooga
0baa50bcc4 Update a comment 2023-04-12 18:26:15 -03:00
oobabooga
5d1d0bd11f Add the GPU index to the label 2023-04-12 18:24:19 -03:00
oobabooga
13789fd200 Handle the no-GPU / multi-GPU cases 2023-04-12 18:21:14 -03:00
oobabooga
1566d8e344 Add model settings to the Models tab 2023-04-12 17:20:18 -03:00
oobabooga
80f4eabb2a Fix send_pictures extension 2023-04-12 10:27:06 -03:00
oobabooga
2289d3686f Update API example 2023-04-11 22:43:43 -03:00
oobabooga
f2be87235d Comment lines that were causing undefined behavior 2023-04-11 22:40:04 -03:00
oobabooga
8265d45db8 Add send dummy message/reply buttons
Useful for starting a new reply.
2023-04-11 22:21:41 -03:00
oobabooga
f2ec880e81 Auto-scroll to the bottom when streaming is over in notebook/default modes 2023-04-11 20:58:10 -03:00
oobabooga
cacbcda208
Two new options: truncation length and ban eos token 2023-04-11 18:46:06 -03:00
catalpaaa
78bbc66fc4
Allow custom stopping strings in all modes (#903) 2023-04-11 12:30:06 -03:00
oobabooga
0f212093a3
Refactor the UI
A single dictionary called 'interface_state' is now passed as input to all functions. The values are updated only when necessary.

The goal is to make it easier to add new elements to the UI.
2023-04-11 11:46:30 -03:00
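The refactor above routes a single `interface_state` dictionary into every UI callback. A toy sketch of why this helps (the keys below are illustrative, not the project's actual schema): adding a new UI element means adding one key, instead of threading a new positional argument through every function signature.

```python
interface_state = {
    'max_new_tokens': 200,
    'temperature': 0.7,
    'seed': -1,
}

def generate(state):
    # Callbacks read only the keys they need from the shared dict.
    return f"tokens={state['max_new_tokens']} temp={state['temperature']}"

assert generate(interface_state) == "tokens=200 temp=0.7"
```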
oobabooga
58b34c0841 Fix chat_prompt_size 2023-04-10 20:06:42 -03:00
Alex "mcmonkey" Goodwin
0caf718a21
add on-page documentation to parameters (#1008) 2023-04-10 17:19:12 -03:00
oobabooga
bd04ff27ad Make the bos token optional 2023-04-10 16:44:22 -03:00
oobabooga
0f1627eff1 Don't treat Instruct mode histories as regular histories
* They must now be saved/loaded manually
* Also improved browser caching of pfps
* Also changed the global default preset
2023-04-10 15:48:07 -03:00
oobabooga
d679c4be13 Change a label 2023-04-10 11:44:37 -03:00
oobabooga
45244ed125 More descriptive download info 2023-04-10 11:42:12 -03:00
oobabooga
11b23db8d4 Remove unused imports 2023-04-10 11:37:42 -03:00
oobabooga
2c14df81a8 Use download-model.py to download the model 2023-04-10 11:36:39 -03:00
oobabooga
c6e9ba20a4 Merge branch 'main' into UsamaKenway-main 2023-04-10 11:14:03 -03:00
oobabooga
d29f4624e9 Add a Continue button to chat mode 2023-04-09 20:04:16 -03:00
oobabooga
f91d3a3ff4 server.py readability 2023-04-09 14:46:32 -03:00
Usama Kenway
ebdf4c8c12 path fixed 2023-04-09 16:53:21 +05:00
Usama Kenway
7436dd5b4a Download custom model menu (from Hugging Face) added in Model tab 2023-04-09 16:11:43 +05:00
oobabooga
cb169d0834 Minor formatting changes 2023-04-08 17:34:07 -03:00
oobabooga
2f16d0afca Remove redundant events 2023-04-08 17:32:36 -03:00
oobabooga
a6a00cb82f
Properly concatenate chat events 2023-04-08 17:25:21 -03:00
Φφ
ffd102e5c0
SD Api Pics extension, v.1.1 (#596) 2023-04-07 21:36:04 -03:00
oobabooga
5543a5089d Auto-submit the whisper extension transcription 2023-04-07 15:57:51 -03:00
oobabooga
1dc464dcb0 Sort imports 2023-04-07 14:42:03 -03:00
oobabooga
962e33dc10 Change button style 2023-04-07 12:22:14 -03:00
Maya
744bf7cbf2
Get rid of type parameter warning (#883)
Fix annoying `The 'type' parameter has been deprecated. Use the Number component instead` warning
2023-04-07 11:17:16 -03:00
oobabooga
ea6e77df72
Make the code more like PEP8 for readability (#862) 2023-04-07 00:15:45 -03:00
oobabooga
5b301d9a02 Create a Model tab 2023-04-06 01:54:05 -03:00
oobabooga
4a400320dd Clean up 2023-04-06 01:47:00 -03:00
Randell Miller
641646a801
Fix crash if missing instructions directory (#812) 2023-04-06 01:24:22 -03:00
oobabooga
3f3e42e26c
Refactor several function calls and the API 2023-04-06 01:22:15 -03:00
oobabooga
7f66421369 Fix loading characters 2023-04-05 14:22:32 -03:00
oobabooga
90141bc1a8 Fix saving prompts on Windows 2023-04-05 14:08:54 -03:00
oobabooga
cf2c4e740b Disable gradio analytics globally 2023-04-05 14:05:50 -03:00
oobabooga
e722c240af Add Instruct mode 2023-04-05 13:54:50 -03:00
oobabooga
ae1fe45bc0 One more cache reset 2023-04-04 23:15:57 -03:00
oobabooga
80dfba05f3 Better crop/resize cached images 2023-04-04 22:52:15 -03:00
oobabooga
65d8a24a6d Show profile pictures in the Character tab 2023-04-04 22:28:49 -03:00
oobabooga
8de22ac82a Merge character upload tabs 2023-04-03 18:01:45 -03:00
oobabooga
3012bdb5e0 Fix a label 2023-04-03 12:20:53 -03:00
OWKenobi
dcf61a8897
"character greeting" displayed and editable on the fly (#743)
* Add greetings field

* add greeting field and make it interactive

* Minor changes

* Fix a bug

* Simplify clear_chat_log

* Change a label

* Minor change

* Simplifications

* Simplification

* Simplify loading the default character history

* Fix regression

---------

Co-authored-by: oobabooga
2023-04-03 12:16:15 -03:00
oobabooga
2a267011dc Use Path.stem for simplicity 2023-04-03 00:56:14 -03:00
TheTerrasque
2157bb4319
New yaml character format (#337 from TheTerrasque/feature/yaml-characters)
This doesn't break backward compatibility with JSON characters.
2023-04-02 20:34:25 -03:00
oobabooga
0dc6fa038b Use gr.State() to store the user input 2023-04-02 18:05:21 -03:00
Brian O'Connor
d0f9625f0b Clear text input for chat
Add logic to clear the textbox for chat input when the user submits or hits the generate button.
2023-04-01 21:48:24 -04:00
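The two commits above keep the submitted text in `gr.State` and clear the textbox on submit. In gradio, whatever a handler returns for an output component replaces that component's value, so returning an empty string empties the box. A framework-free sketch of such a handler (names are illustrative):

```python
def on_submit(message, state):
    # gradio would call this with (textbox value, gr.State value) and map
    # the two return values back onto (textbox, gr.State).
    state = message        # remember the raw user input for generation
    return "", state       # "" wipes the textbox; state carries the text

cleared, remembered = on_submit("Hello there", "")
assert cleared == ""
assert remembered == "Hello there"
```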
oobabooga
b0890a7925 Add shared.is_chat() function 2023-04-01 20:15:00 -03:00
oobabooga
8c51b405e4 Progress towards generalizing Interface mode tab 2023-03-31 23:41:10 -03:00
oobabooga
1d1d9e40cd Add seed to settings 2023-03-31 12:22:07 -03:00
oobabooga
fd72afd8e7 Increase the textbox sizes 2023-03-31 00:43:00 -03:00
oobabooga
bd65940a48 Increase --chat box height 2023-03-30 00:43:49 -03:00
oobabooga
55755e27b9 Don't hardcode prompts in the settings dict/json 2023-03-29 22:47:01 -03:00
oobabooga
1cb9246160 Adapt to the new model names 2023-03-29 21:47:36 -03:00
oobabooga
cac577d99f Fix interface reloading 2023-03-28 13:25:58 -03:00
Alex "mcmonkey" Goodwin
9cc811a0e6 fix LoRA path typo in #549 2023-03-27 22:16:40 -07:00
Alex "mcmonkey" Goodwin
31f04dc615 Merge branch 'main' into add-train-lora-tab 2023-03-27 20:03:30 -07:00
oobabooga
005f552ea3 Some simplifications 2023-03-27 23:29:52 -03:00
oobabooga
fde92048af Merge branch 'main' into catalpaaa-lora-and-model-dir 2023-03-27 23:16:44 -03:00
oobabooga
2f0571bfa4 Small style changes 2023-03-27 21:24:39 -03:00
oobabooga
c2cad30772 Merge branch 'main' into mcmonkey4eva-add-train-lora-tab 2023-03-27 21:05:44 -03:00
oobabooga
641e1a09a7 Don't flash when selecting a new prompt 2023-03-27 14:48:43 -03:00
oobabooga
268abd1cba Add some space in notebook mode 2023-03-27 13:52:12 -03:00
Alex "mcmonkey" Goodwin
c07bcd0850 add some outputs to indicate progress updates (sorta)
Actual progressbar still needed. Also minor formatting fixes.
2023-03-27 09:41:06 -07:00
oobabooga
af65c12900 Change Stop button behavior 2023-03-27 13:23:59 -03:00
oobabooga
572bafcd24 Less verbose message 2023-03-27 12:43:37 -03:00
Alex "mcmonkey" Goodwin
2afe1c13c1 move Training to before Interface mode
as Interface Mode seems to be a core 'settings' page that naturally belongs at the very end
2023-03-27 08:32:32 -07:00
oobabooga
202e981d00 Make Generate/Stop buttons smaller in notebook mode 2023-03-27 12:30:57 -03:00
Alex "mcmonkey" Goodwin
e439228ed8 Merge branch 'main' into add-train-lora-tab 2023-03-27 08:21:19 -07:00
oobabooga
57345b8f30 Add prompt loading/saving menus + reorganize interface 2023-03-27 12:16:37 -03:00
oobabooga
95c97e1747 Unload the model using the "Remove all" button 2023-03-26 23:47:29 -03:00
oobabooga
e07c9e3093 Merge branch 'main' into Brawlence-main 2023-03-26 23:40:51 -03:00
oobabooga
1c77fdca4c Change notebook mode appearance 2023-03-26 22:20:30 -03:00
oobabooga
49c10c5570
Add support for the latest GPTQ models with group-size (#530)
**Warning: old 4-bit weights will not work anymore!**

See here how to get up to date weights: https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#step-2-get-the-pre-converted-weights
2023-03-26 00:11:33 -03:00
Alex "mcmonkey" Goodwin
566898a79a initial lora training tab 2023-03-25 12:08:26 -07:00
catalpaaa
d51cb8292b Update server.py
yea i should go to bed
2023-03-24 17:36:31 -07:00
catalpaaa
9e2963e0c8 Update server.py 2023-03-24 17:35:45 -07:00
catalpaaa
ec2a1facee Update server.py 2023-03-24 17:34:33 -07:00
catalpaaa
b37c54edcf lora-dir, model-dir and login auth
Added lora-dir, model-dir, and a login auth argument that points to a file containing usernames and passwords in the format "u:pw,u:pw,..."
2023-03-24 17:30:18 -07:00
oobabooga
d8e950d6bd
Don't load the model twice when using --lora 2023-03-24 16:30:32 -03:00
oobabooga
fd99995b01
Make the Stop button more consistent in chat mode 2023-03-24 15:59:27 -03:00
oobabooga
9bdb3c784d
Minor fix 2023-03-23 22:02:40 -03:00
oobabooga
bf22d16ebc
Clear cache while switching LoRAs 2023-03-23 21:56:26 -03:00
Φφ
483d173d23 Code reuse + indication
Now shows the message in the console when unloading weights. Also reload_model() calls unload_model() first to free the memory so that multiple reloads won't overfill it.
2023-03-23 07:06:26 +03:00
Φφ
1917b15275 Unload and reload models on request 2023-03-23 07:06:26 +03:00
wywywywy
61346b88ea
Add "seed" menu in the Parameters tab 2023-03-22 15:40:20 -03:00
oobabooga
4d701a6eb9 Create a mirror for the preset menu 2023-03-19 12:51:47 -03:00
oobabooga
20f5b455bf Add parameters reference #386 #331 2023-03-17 20:19:04 -03:00
oobabooga
a717fd709d Sort the imports 2023-03-17 11:42:25 -03:00
oobabooga
29fe7b1c74 Remove LoRA tab, move it into the Parameters menu 2023-03-17 11:39:48 -03:00
oobabooga
214dc6868e Several QoL changes related to LoRA 2023-03-17 11:24:52 -03:00
oobabooga
104293f411 Add LoRA support 2023-03-16 21:31:39 -03:00
oobabooga
38d7017657 Add all command-line flags to "Interface mode" 2023-03-16 12:44:03 -03:00
oobabooga
d54f3f4a34 Add no-stream checkbox to the interface 2023-03-16 10:19:00 -03:00
oobabooga
25a00eaf98 Add "Experimental" warning 2023-03-15 23:43:35 -03:00
oobabooga
599d3139fd Increase the reload timeout a bit 2023-03-15 23:34:08 -03:00
oobabooga
4d64a57092 Add Interface mode tab 2023-03-15 23:29:56 -03:00
oobabooga
ffb898608b Mini refactor 2023-03-15 20:44:34 -03:00
oobabooga
67d62475dc Further reorganize chat UI 2023-03-15 18:56:26 -03:00
oobabooga
c1959c26ee Show/hide the extensions block using javascript 2023-03-15 16:35:28 -03:00
oobabooga
348596f634 Fix broken extensions 2023-03-15 15:11:16 -03:00
oobabooga
658849d6c3 Move a checkbutton 2023-03-15 13:29:00 -03:00
oobabooga
d30a14087f Further reorganize the UI 2023-03-15 13:24:54 -03:00
oobabooga
ffc6cb3116
Merge pull request #325 from Ph0rk0z/fix-RWKV-Names
Fix rwkv names
2023-03-15 12:56:21 -03:00
oobabooga
1413931705 Add a header bar and redesign the interface (#293) 2023-03-15 12:01:32 -03:00
oobabooga
9d6a625bd6 Add 'hallucinations' filter #326
This breaks the API since a new parameter has been added.
It should be a one-line fix. See api-example.py.
2023-03-15 11:10:35 -03:00
Forkoz
3b62bd180d
Remove PTH extension from RWKV
When loading, the current model field was blank unless you typed it out.
2023-03-14 21:23:39 +00:00
Forkoz
f0f325eac1
Remove Json from loading
no more 20b tokenizer
2023-03-14 21:21:47 +00:00
oobabooga
72d207c098
Remove the chat API
It is not implemented, has not been tested, and this is causing confusion.
2023-03-14 16:31:27 -03:00
oobabooga
a95592fc56 Add back a progress indicator to --no-stream 2023-03-12 20:38:40 -03:00
oobabooga
bcf0075278
Merge pull request #235 from xanthousm/Quality_of_life-main
--auto-launch and "Is typing..."
2023-03-12 03:12:56 -03:00
oobabooga
92fe947721 Merge branch 'main' into new-streaming 2023-03-11 19:59:45 -03:00
oobabooga
2743dd736a Add *Is typing...* to impersonate as well 2023-03-11 10:50:18 -03:00
Xan
96c51973f9 --auto-launch and "Is typing..."
- Added `--auto-launch` arg to open web UI in the default browser when ready.
- Changed chat.py to display user input immediately and "*Is typing...*" as a temporary reply while generating text. Most noticeable when using `--no-stream`.
2023-03-11 22:50:59 +11:00
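The "*Is typing...*" change above displays the user's message immediately with a placeholder reply, then swaps in the generated text once it is ready. A generator-based sketch of that flow (function and variable names are illustrative, not the project's):

```python
def chat_turn(history, user_input, generate):
    # Show the user's message right away, paired with a placeholder.
    history = history + [(user_input, "*Is typing...*")]
    yield history
    # When generation finishes, replace the placeholder with the reply.
    history[-1] = (user_input, generate(user_input))
    yield history

turns = list(chat_turn([], "hi", lambda s: s.upper()))
assert turns[-1][-1] == ("hi", "HI")
```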
oobabooga
9849aac0f1 Don't show .pt models in the list 2023-03-09 21:54:50 -03:00
oobabooga
038e90765b Rename to "Text generation web UI" 2023-03-09 09:44:08 -03:00
jtang613
807a41cf87 Let's propose a name besides "Gradio" 2023-03-08 21:02:25 -05:00
oobabooga
ab50f80542 New text streaming method (much faster) 2023-03-08 02:46:35 -03:00
oobabooga
bf56b6c1fb Load settings.json without the need for --settings settings.json
This is for setting UI defaults
2023-03-06 10:57:45 -03:00
oobabooga
bcea196c9d Bump flexgen version 2023-03-02 12:03:57 -03:00
oobabooga
169209805d Model-aware prompts and presets 2023-03-02 11:25:04 -03:00
oobabooga
99dc95e14e Minor aesthetic change 2023-03-01 19:32:04 -03:00
oobabooga
a1429d1607 Add default extensions to the settings 2023-02-28 02:20:11 -03:00
oobabooga
365e1089b3 Move some buttons 2023-02-28 01:34:07 -03:00
oobabooga
43b6ab8673 Store thumbnails as files instead of base64 strings
This improves the UI responsiveness for large histories.
2023-02-27 13:41:00 -03:00
oobabooga
611010e8af Add a confirmation to clear history 2023-02-27 11:41:21 -03:00
oobabooga
7a776ccf87 Make the gallery interactive to load characters 2023-02-26 17:19:36 -03:00
oobabooga
e91eb24649 Decrease the repetition penalty upper limit to 3 2023-02-26 01:51:59 -03:00
oobabooga
3d94ebfdd0 Change --chat colors 2023-02-26 00:51:15 -03:00
oobabooga
b3d2365d92 Rename a button 2023-02-25 16:33:46 -03:00
oobabooga
03d25c1c61 Reorder the chat buttons 2023-02-25 15:35:43 -03:00
oobabooga
e2cf4e4968 Reorder the custom parameters 2023-02-25 15:21:40 -03:00
oobabooga
381f747181 Reorganize the custom parameters for mobile usage 2023-02-25 15:17:44 -03:00
oobabooga
01acb250c5 Add a comment 2023-02-25 02:07:29 -03:00
oobabooga
7c2babfe39 Rename greed to "generation attempts" 2023-02-25 01:42:19 -03:00
oobabooga
2dfb999bf1 Add greed parameter 2023-02-25 01:31:01 -03:00
oobabooga
7a527a5581 Move "send picture" into an extension
I am not proud of how I did it for now.
2023-02-25 00:23:51 -03:00
oobabooga
e51ece21c0 Add ui() function to extensions 2023-02-24 19:00:11 -03:00
oobabooga
77f58e5dab Remove a space 2023-02-24 17:32:34 -03:00
oobabooga
c5066f1192 Rename some variables, be consistent about ' and " 2023-02-24 17:31:23 -03:00
oobabooga
78ad55641b Remove duplicate max_new_tokens parameter 2023-02-24 17:19:42 -03:00
oobabooga
65326b545a Move all gradio elements to shared (so that extensions can use them) 2023-02-24 16:46:50 -03:00
oobabooga
0a3590da8c Add a progress bar 2023-02-24 14:19:27 -03:00
oobabooga
3b8cecbab7 Reload the default chat on page refresh 2023-02-23 19:50:23 -03:00
oobabooga
f1914115d3 Fix minor issue with chat logs 2023-02-23 16:04:47 -03:00
oobabooga
2e86a1ec04 Move chat history into shared module 2023-02-23 15:11:18 -03:00
oobabooga
c87800341c Move function to extensions module 2023-02-23 14:55:21 -03:00
oobabooga
7224343a70 Improve the imports 2023-02-23 14:41:42 -03:00
oobabooga
364529d0c7 Further refactor 2023-02-23 14:31:28 -03:00
oobabooga
e46c43afa6 Move some stuff from server.py to modules 2023-02-23 13:42:23 -03:00
oobabooga
1dacd34165 Further refactor 2023-02-23 13:28:30 -03:00
oobabooga
ce7feb3641 Further refactor 2023-02-23 13:03:52 -03:00
oobabooga
98af4bfb0d Refactor the code to make it more modular 2023-02-23 12:05:25 -03:00
oobabooga
18e0ec955e Improve some descriptions in --help 2023-02-23 10:11:58 -03:00
oobabooga
c72892835a Don't show *-np models in the list of choices 2023-02-22 11:38:16 -03:00
oobabooga
044b963987 Add stop parameter for flexgen (#105) 2023-02-22 11:23:36 -03:00
oobabooga
ea21a22940 Remove redundant preset 2023-02-22 01:01:26 -03:00
oobabooga
b8b3d4139c Add --compress-weight parameter 2023-02-22 00:43:21 -03:00
oobabooga
eef6fc3cbf Add a preset for FlexGen 2023-02-21 23:33:15 -03:00
oobabooga
311404e258 Reuse disk-cache-dir parameter for flexgen 2023-02-21 22:11:05 -03:00
oobabooga
f3c75bbd64 Add --percent flag for flexgen 2023-02-21 22:08:46 -03:00
oobabooga
b83f51ee04 Add FlexGen support #92 (experimental) 2023-02-21 21:00:06 -03:00
oobabooga
444cd69c67 Fix regex bug in loading character jsons with special characters 2023-02-20 19:38:19 -03:00
oobabooga
d7a738fb7a Load any 13b/20b/30b model in 8-bit mode when no flags are supplied 2023-02-20 15:44:10 -03:00
oobabooga
77846ceef3 Minor change 2023-02-20 15:05:48 -03:00
oobabooga
e195377050 Deprecate torch dumps, move to safetensors (they load even faster) 2023-02-20 15:03:19 -03:00
oobabooga
14ffa0b418 Fix line breaks in --chat mode 2023-02-20 13:25:46 -03:00
SillyLossy
ded890c378 Escape regexp in message extraction 2023-02-19 12:55:45 +02:00
oobabooga
8c9dd95d55
Print the softprompt metadata when it is loaded 2023-02-19 01:48:23 -03:00
oobabooga
f79805f4a4
Change a comment 2023-02-18 22:58:40 -03:00
oobabooga
d58544a420 Some minor formatting changes 2023-02-18 11:07:55 -03:00
oobabooga
0dd41e4830 Reorganize the sliders some more 2023-02-17 16:33:27 -03:00
oobabooga
6b9ac2f88e Reorganize the generation parameters 2023-02-17 16:18:01 -03:00
oobabooga
596732a981 The soft prompt length must be considered here too 2023-02-17 12:35:30 -03:00
oobabooga
edc0262889 Minor file uploading fixes 2023-02-17 10:27:41 -03:00
oobabooga
243244eeec Attempt at fixing greyed out files on iphone 2023-02-17 10:17:15 -03:00
oobabooga
a226f4cddb No change, so reverting 2023-02-17 09:27:17 -03:00
oobabooga
40cb9f63f6 Try making Colab happy (tensorflow warnings) 2023-02-17 09:23:11 -03:00
oobabooga
aeddf902ec Make the refresh button prettier 2023-02-16 21:55:20 -03:00
oobabooga
21512e2790 Make the Stop button work more reliably 2023-02-16 21:21:45 -03:00
oobabooga
08805b3374 Force "You" in impersonate too 2023-02-16 13:24:13 -03:00
oobabooga
d7db04403f Fix --chat chatbox height 2023-02-16 12:45:05 -03:00
oobabooga
589069e105 Don't regenerate if no message has been sent 2023-02-16 12:32:35 -03:00
oobabooga
405dfbf57c Force your name to be "You" for pygmalion (properly) 2023-02-16 12:16:12 -03:00
oobabooga
7bd2ae05bf Force your name to be "You" for pygmalion
This allows you to customize your displayed name.
2023-02-15 21:32:53 -03:00
oobabooga
3746d72853 More style fixes 2023-02-15 21:13:12 -03:00
oobabooga
6f213b8c14 Style fix 2023-02-15 20:58:17 -03:00
oobabooga
ccf10db60f Move stuff into tabs in chat mode 2023-02-15 20:55:32 -03:00
oobabooga
a55e8836f6 Bump gradio version
It looks uglier, but the old one was bugged and unstable.
2023-02-15 20:20:56 -03:00
oobabooga
0e89ff4b13 Clear the persistent history after clicking on "Clear history" 2023-02-15 16:49:52 -03:00
oobabooga
b3bcd2881d Implement regenerate/impersonate the proper way (fixes #78) 2023-02-15 14:39:26 -03:00
oobabooga
5ee9283cae Mention BLIP 2023-02-15 13:53:38 -03:00
oobabooga
8d3b3959e7 Document --picture option 2023-02-15 13:50:18 -03:00
oobabooga
2eea0f4edb Minor change 2023-02-15 12:58:11 -03:00
oobabooga
3c31fa7079 Simplifications 2023-02-15 12:46:11 -03:00
oobabooga
80fbc584f7 Readability 2023-02-15 11:38:44 -03:00
oobabooga
b397bea387 Make chat history persistent 2023-02-15 11:30:38 -03:00
oobabooga
7be372829d Set chat prompt size in tokens 2023-02-15 10:18:50 -03:00
oobabooga
8c3ef58e00 Use BLIP directly + some simplifications 2023-02-14 23:55:46 -03:00
SillyLossy
a7d98f494a Use BLIP to send a picture to model 2023-02-15 01:38:21 +02:00
oobabooga
d910d435cd Consider the softprompt in the maximum prompt length calculation 2023-02-14 12:06:47 -03:00
oobabooga
8b3bb512ef Minor bug fix (soft prompt was being loaded twice) 2023-02-13 23:34:04 -03:00
oobabooga
7739a29524 Some simplifications 2023-02-13 18:48:32 -03:00
oobabooga
3277b751f5 Add softprompt support (for real this time)
Is this too much voodoo for our purposes?
2023-02-13 15:25:16 -03:00
oobabooga
aa1177ff15 Send last internal reply to input rather than visible 2023-02-13 03:29:23 -03:00
oobabooga
2c3abcf57a Add support for rosey/chip/joi instruct models 2023-02-12 09:46:34 -03:00
oobabooga
7ef7bba6e6 Add progress bar for model loading 2023-02-12 09:36:27 -03:00
oobabooga
5d3f15b915 Use the CPU if no GPU is detected 2023-02-11 23:17:06 -03:00