oobabooga | 54e77acac4 | Rename to "Generation parameters preset" for clarity | 2023-01-23 20:49:44 -03:00
oobabooga | ce4756fb88 | Allow uploading chat history in official pygmalion web ui format | 2023-01-23 15:29:01 -03:00
oobabooga | 8325e23923 | Fix bug in loading chat history as text file | 2023-01-23 14:28:02 -03:00
oobabooga | 059d47edb5 | Submit with enter instead of shift+enter in chat mode | 2023-01-23 14:04:01 -03:00
oobabooga | 4820379139 | Add debug preset (deterministic, should always give the same responses) | 2023-01-23 13:36:01 -03:00
oobabooga | 947b50e8ea | Allow uploading chat history as simple text files | 2023-01-23 09:45:10 -03:00
oobabooga | ebf720585b | Mention time and it/s in terminal with streaming off | 2023-01-22 20:07:19 -03:00
oobabooga | d87310ad61 | Send last input to the input box when "Remove last" is clicked | 2023-01-22 19:40:22 -03:00
oobabooga | d0ea6d5f86 | Make the maximum history size in prompt unlimited by default | 2023-01-22 17:17:35 -03:00
oobabooga | 00f3b0996b | Warn the user that chat mode becomes a lot slower with text streaming | 2023-01-22 16:19:11 -03:00
oobabooga | c5cc3a3075 | Fix bug in "remove last" button | 2023-01-22 13:10:36 -03:00
oobabooga | a410cf1345 | Mention that "Chat history size" means "Chat history size in prompt" | 2023-01-22 03:15:35 -03:00
oobabooga | b3e1a874bc | Fix bug in loading history | 2023-01-22 02:32:54 -03:00
oobabooga | 62b533f344 | Add "regenerate" button to the chat | 2023-01-22 02:19:58 -03:00
oobabooga | 94ecbc6dff | Export history as nicely formatted json | 2023-01-22 01:24:16 -03:00
oobabooga | deacb96c34 | Change the pygmalion default context | 2023-01-22 00:49:59 -03:00
oobabooga | 23f94f559a | Improve the chat prompt design | 2023-01-22 00:35:42 -03:00
oobabooga | 139e2f0ab4 | Redesign the upload/download chat history buttons | 2023-01-22 00:22:50 -03:00
oobabooga | 434d4b128c | Add refresh buttons for the model/preset/character menus | 2023-01-22 00:02:46 -03:00
oobabooga | 1e5e56fa2e | Better recognize the 4chan model (for #19) | 2023-01-21 22:13:01 -03:00
oobabooga | aadf4e899a | Improve example dialogue handling | 2023-01-21 15:04:13 -03:00
oobabooga | f9dbe7e08e | Update README | 2023-01-21 03:05:55 -03:00
oobabooga | 27e2d932b0 | Don't include the example dialogue in the exported json | 2023-01-21 02:55:13 -03:00
oobabooga | 990ee54ddd | Move the example dialogue to the chat history, and keep it hidden. | 2023-01-21 02:48:06 -03:00
    This greatly improves the performance of text generation, as histories can be quite long. It also makes more sense to implement it this way.
oobabooga | d7299df01f | Rename parameters | 2023-01-21 00:33:41 -03:00
oobabooga | 5df03bf0fd | Merge branch 'main' into main | 2023-01-21 00:25:34 -03:00
oobabooga | faaafe7c0e | Better parameter naming | 2023-01-20 23:45:16 -03:00
Silver267 | f4634e4c32 | Update. | 2023-01-20 17:05:43 -05:00
oobabooga | c0f2367b54 | Minor fix | 2023-01-20 17:09:25 -03:00
oobabooga | 185587a33e | Add a history size parameter to the chat | 2023-01-20 17:03:09 -03:00
    If too many messages are used in the prompt, the model gets really slow. It is useful to have the ability to limit this.
oobabooga | 78d5a999e6 | Improve prompt formatting | 2023-01-20 01:54:38 -03:00
oobabooga | 70ff685736 | Encode the input string correctly | 2023-01-20 00:45:02 -03:00
oobabooga | b66d18d5a0 | Allow presets/characters with '.' in their names | 2023-01-19 21:56:33 -03:00
oobabooga | 11c3214981 | Fix some regexes | 2023-01-19 19:59:34 -03:00
oobabooga | e61138bdad | Minor fixes | 2023-01-19 19:04:54 -03:00
oobabooga | 2181fca709 | Better defaults for chat | 2023-01-19 18:58:45 -03:00
oobabooga | 83808171d3 | Add --share option for Colab | 2023-01-19 17:31:29 -03:00
oobabooga | 8d788874d7 | Add support for characters | 2023-01-19 16:46:46 -03:00
oobabooga | 3121f4788e | Fix uploading chat log in --chat mode | 2023-01-19 15:05:42 -03:00
oobabooga | 849e4c7f90 | Better way of finding the generated reply in the output string | 2023-01-19 14:57:01 -03:00
oobabooga | d03b0ad7a8 | Implement saving/loading chat logs (#9) | 2023-01-19 14:03:47 -03:00
oobabooga | 39bfea5a22 | Add a progress bar | 2023-01-19 12:20:57 -03:00
oobabooga | 5390fc87c8 | add auto-devices when disk is used | 2023-01-19 12:11:44 -03:00
oobabooga | 759da435e3 | Release 8-bit models memory | 2023-01-19 12:03:16 -03:00
oobabooga | 7ace04864a | Implement sending layers to disk with --disk (#10) | 2023-01-19 11:09:24 -03:00
oobabooga | 93fa9bbe01 | Clean up the streaming implementation | 2023-01-19 10:43:05 -03:00
oobabooga | c90310e40e | Small simplification | 2023-01-19 00:41:57 -03:00
oobabooga | 99536ef5bf | Add no-stream option | 2023-01-18 23:56:42 -03:00
oobabooga | 116299b3ad | Manual eos_token implementation | 2023-01-18 22:57:39 -03:00
oobabooga | 3cb30bed0a | Add a "stop" button | 2023-01-18 22:44:47 -03:00