Commit Graph

356 Commits

Author SHA1 Message Date
Forkoz
3b62bd180d Remove PTH extension from RWKV
When loading, the current model was blank unless you typed it out.
2023-03-14 21:23:39 +00:00
Forkoz
f0f325eac1 Remove Json from loading
no more 20b tokenizer
2023-03-14 21:21:47 +00:00
oobabooga
72d207c098 Remove the chat API
It is not implemented, has not been tested, and this is causing confusion.
2023-03-14 16:31:27 -03:00
oobabooga
a95592fc56 Add back a progress indicator to --no-stream 2023-03-12 20:38:40 -03:00
oobabooga
bcf0075278 Merge pull request #235 from xanthousm/Quality_of_life-main
--auto-launch and "Is typing..."
2023-03-12 03:12:56 -03:00
oobabooga
92fe947721 Merge branch 'main' into new-streaming 2023-03-11 19:59:45 -03:00
oobabooga
2743dd736a Add *Is typing...* to impersonate as well 2023-03-11 10:50:18 -03:00
Xan
96c51973f9 --auto-launch and "Is typing..."
- Added `--auto-launch` arg to open web UI in the default browser when ready.
- Changed chat.py to display user input immediately and "*Is typing...*" as a temporary reply while generating text. Most noticeable when using `--no-stream`.
2023-03-11 22:50:59 +11:00
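The "*Is typing...*" change described in the commit above works by displaying the user's message immediately with a placeholder reply, then swapping in the generated text once it is ready. A minimal sketch of that pattern, assuming a Gradio-style generator callback; `chat_response`, `generate_reply`, and the history format are illustrative stand-ins, not the repository's actual code:

```python
# Illustrative sketch: show the user's input right away with a temporary
# "*Is typing...*" reply, then replace the placeholder with the real answer.
def chat_response(user_input, history):
    history.append([user_input, "*Is typing...*"])
    yield history  # first update: user message + placeholder appear at once

    reply = generate_reply(user_input)  # hypothetical call into the model
    history[-1][1] = reply
    yield history  # second update: placeholder replaced by the generated text

def generate_reply(user_input):
    # Stand-in for the actual generation call.
    return f"(echo) {user_input}"
```

The `--auto-launch` half of the commit typically maps to opening the browser from the server side (in Gradio, something like `launch(inbrowser=True)`), though the exact wiring is not shown in the commit message.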
oobabooga
9849aac0f1 Don't show .pt models in the list 2023-03-09 21:54:50 -03:00
oobabooga
038e90765b Rename to "Text generation web UI" 2023-03-09 09:44:08 -03:00
jtang613
807a41cf87 Lets propose a name besides "Gradio" 2023-03-08 21:02:25 -05:00
oobabooga
ab50f80542 New text streaming method (much faster) 2023-03-08 02:46:35 -03:00
oobabooga
bf56b6c1fb Load settings.json without the need for --settings settings.json
This is for setting UI defaults
2023-03-06 10:57:45 -03:00
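The commit above lets UI defaults come from a settings.json sitting next to the server even when --settings is not passed. A minimal sketch of that fallback, with hypothetical names:

```python
# Illustrative sketch: use an explicit --settings path when given, otherwise
# fall back to a local settings.json; return {} if neither exists.
import json
from pathlib import Path

def load_settings(settings_arg=None):
    path = Path(settings_arg) if settings_arg else Path("settings.json")
    if path.exists():
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)
    return {}  # no settings file found: keep the built-in defaults
```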
oobabooga
bcea196c9d Bump flexgen version 2023-03-02 12:03:57 -03:00
oobabooga
169209805d Model-aware prompts and presets 2023-03-02 11:25:04 -03:00
oobabooga
99dc95e14e Minor aesthetic change 2023-03-01 19:32:04 -03:00
oobabooga
a1429d1607 Add default extensions to the settings 2023-02-28 02:20:11 -03:00
oobabooga
365e1089b3 Move some buttons 2023-02-28 01:34:07 -03:00
oobabooga
43b6ab8673 Store thumbnails as files instead of base64 strings
This improves the UI responsiveness for large histories.
2023-02-27 13:41:00 -03:00
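Switching thumbnails from base64 strings to files means the chat HTML can reference a small cached image by path instead of embedding image data in every page update, which is what makes large histories more responsive. A rough sketch of the idea using Pillow; the cache directory and thumbnail size are assumptions, not the repository's actual values:

```python
# Illustrative sketch: write a downscaled thumbnail to a cache directory once,
# then reuse its path instead of re-encoding the image as a base64 data URI.
from pathlib import Path
from PIL import Image

def thumbnail_path(source_image, cache_dir=Path("cache")):
    cache_dir.mkdir(exist_ok=True)
    thumb = cache_dir / f"{Path(source_image).stem}_thumb.png"
    if not thumb.exists():
        img = Image.open(source_image)
        img.thumbnail((200, 200))  # shrink in place, preserving aspect ratio
        img.save(thumb, format="PNG")
    return thumb  # the UI can point an <img> tag at this file
```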
oobabooga
611010e8af Add a confirmation to clear history 2023-02-27 11:41:21 -03:00
oobabooga
7a776ccf87 Make the gallery interactive to load characters 2023-02-26 17:19:36 -03:00
oobabooga
e91eb24649 Decrease the repetition penalty upper limit to 3 2023-02-26 01:51:59 -03:00
oobabooga
3d94ebfdd0 Change --chat colors 2023-02-26 00:51:15 -03:00
oobabooga
b3d2365d92 Rename a button 2023-02-25 16:33:46 -03:00
oobabooga
03d25c1c61 Reorder the chat buttons 2023-02-25 15:35:43 -03:00
oobabooga
e2cf4e4968 Reorder the custom parameters 2023-02-25 15:21:40 -03:00
oobabooga
381f747181 Reorganize the custom parameters for mobile usage 2023-02-25 15:17:44 -03:00
oobabooga
01acb250c5 Add a comment 2023-02-25 02:07:29 -03:00
oobabooga
7c2babfe39 Rename greed to "generation attempts" 2023-02-25 01:42:19 -03:00
oobabooga
2dfb999bf1 Add greed parameter 2023-02-25 01:31:01 -03:00
oobabooga
7a527a5581 Move "send picture" into an extension
I am not proud of how I did it for now.
2023-02-25 00:23:51 -03:00
oobabooga
e51ece21c0 Add ui() function to extensions 2023-02-24 19:00:11 -03:00
oobabooga
77f58e5dab Remove a space 2023-02-24 17:32:34 -03:00
oobabooga
c5066f1192 Rename some variables, be consistent about ' and " 2023-02-24 17:31:23 -03:00
oobabooga
78ad55641b Remove duplicate max_new_tokens parameter 2023-02-24 17:19:42 -03:00
oobabooga
65326b545a Move all gradio elements to shared (so that extensions can use them) 2023-02-24 16:46:50 -03:00
oobabooga
0a3590da8c Add a progress bar 2023-02-24 14:19:27 -03:00
oobabooga
3b8cecbab7 Reload the default chat on page refresh 2023-02-23 19:50:23 -03:00
oobabooga
f1914115d3 Fix minor issue with chat logs 2023-02-23 16:04:47 -03:00
oobabooga
2e86a1ec04 Move chat history into shared module 2023-02-23 15:11:18 -03:00
oobabooga
c87800341c Move function to extensions module 2023-02-23 14:55:21 -03:00
oobabooga
7224343a70 Improve the imports 2023-02-23 14:41:42 -03:00
oobabooga
364529d0c7 Further refactor 2023-02-23 14:31:28 -03:00
oobabooga
e46c43afa6 Move some stuff from server.py to modules 2023-02-23 13:42:23 -03:00
oobabooga
1dacd34165 Further refactor 2023-02-23 13:28:30 -03:00
oobabooga
ce7feb3641 Further refactor 2023-02-23 13:03:52 -03:00
oobabooga
98af4bfb0d Refactor the code to make it more modular 2023-02-23 12:05:25 -03:00
oobabooga
18e0ec955e Improve some descriptions in --help 2023-02-23 10:11:58 -03:00
oobabooga
c72892835a Don't show *-np models in the list of choices 2023-02-22 11:38:16 -03:00
oobabooga
044b963987 Add stop parameter for flexgen (#105) 2023-02-22 11:23:36 -03:00
oobabooga
ea21a22940 Remove redundant preset 2023-02-22 01:01:26 -03:00
oobabooga
b8b3d4139c Add --compress-weight parameter 2023-02-22 00:43:21 -03:00
oobabooga
eef6fc3cbf Add a preset for FlexGen 2023-02-21 23:33:15 -03:00
oobabooga
311404e258 Reuse disk-cache-dir parameter for flexgen 2023-02-21 22:11:05 -03:00
oobabooga
f3c75bbd64 Add --percent flag for flexgen 2023-02-21 22:08:46 -03:00
oobabooga
b83f51ee04 Add FlexGen support #92 (experimental) 2023-02-21 21:00:06 -03:00
oobabooga
444cd69c67 Fix regex bug in loading character jsons with special characters 2023-02-20 19:38:19 -03:00
oobabooga
d7a738fb7a Load any 13b/20b/30b model in 8-bit mode when no flags are supplied 2023-02-20 15:44:10 -03:00
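The commit above describes a simple heuristic: when the user supplies no loading flags, models whose names look like 13B/20B/30B checkpoints are loaded in 8-bit mode so they fit in less VRAM. A hedged sketch of that rule; the flag and function names are illustrative:

```python
# Illustrative sketch: infer 8-bit loading from the model name when no
# explicit loading flags were passed on the command line.
import re

def should_load_in_8bit(model_name, cpu=False, load_in_8bit=False, auto_devices=False):
    if cpu or load_in_8bit or auto_devices:
        return load_in_8bit  # an explicit choice was made; keep it
    # No flags supplied: default large models to 8-bit.
    return bool(re.search(r"(13|20|30)b", model_name.lower()))
```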
oobabooga
77846ceef3 Minor change 2023-02-20 15:05:48 -03:00
oobabooga
e195377050 Deprecate torch dumps, move to safetensors (they load even faster) 2023-02-20 15:03:19 -03:00
oobabooga
14ffa0b418 Fix line breaks in --chat mode 2023-02-20 13:25:46 -03:00
SillyLossy
ded890c378 Escape regexp in message extraction 2023-02-19 12:55:45 +02:00
oobabooga
8c9dd95d55 Print the softprompt metadata when it is loaded 2023-02-19 01:48:23 -03:00
oobabooga
f79805f4a4 Change a comment 2023-02-18 22:58:40 -03:00
oobabooga
d58544a420 Some minor formatting changes 2023-02-18 11:07:55 -03:00
oobabooga
0dd41e4830 Reorganize the sliders some more 2023-02-17 16:33:27 -03:00
oobabooga
6b9ac2f88e Reorganize the generation parameters 2023-02-17 16:18:01 -03:00
oobabooga
596732a981 The soft prompt length must be considered here too 2023-02-17 12:35:30 -03:00
oobabooga
edc0262889 Minor file uploading fixes 2023-02-17 10:27:41 -03:00
oobabooga
243244eeec Attempt at fixing greyed out files on iphone 2023-02-17 10:17:15 -03:00
oobabooga
a226f4cddb No change, so reverting 2023-02-17 09:27:17 -03:00
oobabooga
40cb9f63f6 Try making Colab happy (tensorflow warnings) 2023-02-17 09:23:11 -03:00
oobabooga
aeddf902ec Make the refresh button prettier 2023-02-16 21:55:20 -03:00
oobabooga
21512e2790 Make the Stop button work more reliably 2023-02-16 21:21:45 -03:00
oobabooga
08805b3374 Force "You" in impersonate too 2023-02-16 13:24:13 -03:00
oobabooga
d7db04403f Fix --chat chatbox height 2023-02-16 12:45:05 -03:00
oobabooga
589069e105 Don't regenerate if no message has been sent 2023-02-16 12:32:35 -03:00
oobabooga
405dfbf57c Force your name to be "You" for pygmalion (properly) 2023-02-16 12:16:12 -03:00
oobabooga
7bd2ae05bf Force your name to be "You" for pygmalion
This allows you to customize your displayed name.
2023-02-15 21:32:53 -03:00
oobabooga
3746d72853 More style fixes 2023-02-15 21:13:12 -03:00
oobabooga
6f213b8c14 Style fix 2023-02-15 20:58:17 -03:00
oobabooga
ccf10db60f Move stuff into tabs in chat mode 2023-02-15 20:55:32 -03:00
oobabooga
a55e8836f6 Bump gradio version
It looks uglier, but the old one was bugged and unstable.
2023-02-15 20:20:56 -03:00
oobabooga
0e89ff4b13 Clear the persistent history after clicking on "Clear history" 2023-02-15 16:49:52 -03:00
oobabooga
b3bcd2881d Implement regenerate/impersonate the proper way (fixes #78) 2023-02-15 14:39:26 -03:00
oobabooga
5ee9283cae Mention BLIP 2023-02-15 13:53:38 -03:00
oobabooga
8d3b3959e7 Document --picture option 2023-02-15 13:50:18 -03:00
oobabooga
2eea0f4edb Minor change 2023-02-15 12:58:11 -03:00
oobabooga
3c31fa7079 Simplifications 2023-02-15 12:46:11 -03:00
oobabooga
80fbc584f7 Readability 2023-02-15 11:38:44 -03:00
oobabooga
b397bea387 Make chat history persistent 2023-02-15 11:30:38 -03:00
oobabooga
7be372829d Set chat prompt size in tokens 2023-02-15 10:18:50 -03:00
oobabooga
8c3ef58e00 Use BLIP directly + some simplifications 2023-02-14 23:55:46 -03:00
SillyLossy
a7d98f494a Use BLIP to send a picture to model 2023-02-15 01:38:21 +02:00
oobabooga
d910d435cd Consider the softprompt in the maximum prompt length calculation 2023-02-14 12:06:47 -03:00
oobabooga
8b3bb512ef Minor bug fix (soft prompt was being loaded twice) 2023-02-13 23:34:04 -03:00
oobabooga
7739a29524 Some simplifications 2023-02-13 18:48:32 -03:00
oobabooga
3277b751f5 Add softprompt support (for real this time)
Is this too much voodoo for our purposes?
2023-02-13 15:25:16 -03:00
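Several commits in this range touch softprompts (printing their metadata, counting them toward the maximum prompt length, and the support added here). The general technique is to prepend learned embedding vectors to the token embeddings and charge their length against the context window. A rough PyTorch sketch of that idea; shapes, names, and the 2048-token context are assumptions, not the repository's implementation:

```python
# Illustrative sketch: prepend soft-prompt embeddings to the token embeddings
# and reserve part of the context window for them.
import torch

def apply_soft_prompt(input_ids, embed_tokens, soft_prompt, max_context=2048):
    # soft_prompt: (soft_len, hidden_size) learned embedding matrix
    soft_len = soft_prompt.shape[0]

    # The soft prompt uses up context, so keep only the most recent tokens.
    budget = max_context - soft_len
    input_ids = input_ids[:, -budget:]

    token_embeds = embed_tokens(input_ids)                  # (1, seq_len, hidden)
    soft = soft_prompt.unsqueeze(0).to(token_embeds.dtype)  # (1, soft_len, hidden)
    return torch.cat([soft, token_embeds], dim=1)           # feed as inputs_embeds
```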
oobabooga
aa1177ff15 Send last internal reply to input rather than visible 2023-02-13 03:29:23 -03:00
oobabooga
2c3abcf57a Add support for rosey/chip/joi instruct models 2023-02-12 09:46:34 -03:00