81300 | 20dbef9623 | Extend bfloat16 support | 2023-02-09 20:00:03 +02:00
oobabooga | cadd100405 | min_length has to be 0 when streaming is on | 2023-02-08 00:23:35 -03:00
oobabooga | 6be571cff7 | Better variable names | 2023-02-08 00:19:20 -03:00
oobabooga | 58b07cca81 | length_penalty can be negative (apparently) | 2023-02-07 23:33:02 -03:00
oobabooga | 7e4c25691d | Repetition penalty has to be < 5 | 2023-02-07 23:23:39 -03:00
oobabooga | 1c30e1b49a | Add even more sliders | 2023-02-07 23:11:04 -03:00
oobabooga | 24dc705eca | Add lots of sliders | 2023-02-07 22:08:21 -03:00
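The slider commits above pin down value ranges for the generation parameters: repetition_penalty kept below 5, length_penalty allowed to go negative, min_length forced to 0 while streaming. A minimal sketch of how such ranges could be exposed with gradio sliders; the variable names and exact bounds are illustrative, not the project's actual UI code:

```python
import gradio as gr

with gr.Blocks() as demo:
    # Bounds mirror the constraints mentioned in the commits above (illustrative only).
    repetition_penalty = gr.Slider(1.0, 4.99, value=1.2, step=0.01,
                                   label="repetition_penalty (must stay < 5)")
    length_penalty = gr.Slider(-5.0, 5.0, value=1.0, step=0.1,
                               label="length_penalty (negative values allowed)")
    min_length = gr.Slider(0, 2000, value=0, step=1,
                           label="min_length (kept at 0 when streaming)")
```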
Martin J | 06a4664805 | Fix a regex issue in tokenize_dialogue | 2023-02-05 07:42:57 +01:00
    The existing regex would fail if using character names that start with numbers, for example: 9S or 2B.
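A hedged illustration of the kind of regex problem described above: a pattern anchored on a leading letter misses speaker names such as 9S or 2B, while a \w-based pattern accepts them. The patterns below are illustrative; they are not the project's actual tokenize_dialogue regex.

```python
import re

dialogue = "9S: Hello there.\n2B: Understood."

# A letter-anchored speaker pattern misses names that start with a digit...
strict = re.findall(r"^([A-Za-z][A-Za-z0-9 ]*):", dialogue, flags=re.MULTILINE)
print(strict)   # []

# ...while a \w-based pattern also matches 9S and 2B.
relaxed = re.findall(r"^([\w ]+?):", dialogue, flags=re.MULTILINE)
print(relaxed)  # ['9S', '2B']
```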
oobabooga | 2fe235738e | Reorganize chat buttons | 2023-02-04 22:53:42 -03:00
oobabooga | 2207d44986 | Windows doesn't like : in filenames | 2023-02-04 20:07:39 -03:00
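The colon problem arises from the timestamped history filenames introduced in the "Save chat history with name/date in filename" commit further down: an ISO-style time such as 09:02:35 is not a valid Windows filename component. A minimal sketch of a Windows-safe timestamp; the filename layout and helper name are hypothetical:

```python
from datetime import datetime
from pathlib import Path

def history_filename(character: str) -> Path:
    # strftime with dashes instead of colons keeps the name valid on Windows.
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    return Path("logs") / f"{character}_{stamp}.json"

print(history_filename("Example"))  # e.g. logs/Example_20230204-200739.json
```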
oobabooga | 65266f3349 | Fix loading official colab chat logs | 2023-02-03 22:43:02 -03:00
oobabooga | 44e8c671f9 | Fix API documentation formatting in chat mode | 2023-02-03 10:00:05 -03:00
oobabooga | a28f0d8bd7 | Show it/s in the same units with or without streaming | 2023-02-03 09:11:11 -03:00
    Closes #49
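A sketch of the unit issue behind the commit above: with streaming off the whole generation is timed at once, while with streaming on each chunk is timed separately, so a consistent report is simply new tokens divided by total elapsed time. The code below is illustrative, not the project's timing code:

```python
import time

def report_rate(generate, prompt_tokens: int) -> None:
    t0 = time.time()
    total_tokens = generate()            # hypothetical callable returning the total token count
    elapsed = time.time() - t0
    new_tokens = total_tokens - prompt_tokens
    # Same units whether streaming is on or off: new tokens per second over the whole run.
    print(f"{elapsed:.2f} seconds ({new_tokens / elapsed:.2f} it/s, {new_tokens} tokens)")
```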
oobabooga | 4e4cd67223 | Save chat history with name/date in filename | 2023-02-03 09:02:35 -03:00
    closes #50
oobabooga | 3af3ffeb90 | Make --help output more readable | 2023-02-02 23:36:28 -03:00
oobabooga | 638495b633 | Simplify generate() function | 2023-02-02 13:47:08 -03:00
oobabooga | 3f05cf5ddd | Simplify encode() function | 2023-02-02 13:31:32 -03:00
oobabooga | 2583bc5840 | Simplify deepspeed implementation (#40) | 2023-02-02 12:15:44 -03:00
oobabooga | f38c9bf428 | Fix deepspeed (oops) | 2023-02-02 10:39:37 -03:00
oobabooga | 90f1067598 | Move deepspeed parameters to another file | 2023-02-02 10:25:09 -03:00
81300 | 248ec4fa21 | Merge branch 'oobabooga:main' into ds | 2023-02-01 20:50:51 +02:00
81300 | a6f4760772 | Add arg for bfloat16 | 2023-02-01 20:22:07 +02:00
81300 | c515282f5c | no_split_module_classes not needed | 2023-02-01 19:47:26 +02:00
81300 | 0a0d289537 | Fix issue with generating on multiple GPUs | 2023-02-01 19:02:07 +02:00
81300 | a97afa6965 | Add DeepSpeed ZeRO-3 integration | 2023-02-01 18:48:13 +02:00
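For context on the ZeRO-3 integration commit, a minimal sketch of a DeepSpeed ZeRO stage-3 inference configuration as typically used with Hugging Face transformers; the values and the HfDeepSpeedConfig pattern are an assumption about the general approach, not a copy of the project's settings:

```python
import torch
from transformers import AutoModelForCausalLM
from transformers.deepspeed import HfDeepSpeedConfig  # keeps ZeRO-3 sharding active during from_pretrained

ds_config = {
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,
        "offload_param": {"device": "cpu", "pin_memory": True},
    },
    "train_micro_batch_size_per_gpu": 1,
}

# The config object must exist before the model is loaded for stage 3 to take effect;
# deepspeed.initialize(model=model, config_params=ds_config) would follow.
dschf = HfDeepSpeedConfig(ds_config)  # kept alive on purpose
model = AutoModelForCausalLM.from_pretrained("gpt2", torch_dtype=torch.float16)
```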
oobabooga | 6b13816c47 | Change default --disk behavior | 2023-02-01 10:43:28 -03:00
oobabooga | 119be56390 | Add back low_cpu_mem_usage=True | 2023-02-01 10:01:44 -03:00
    Removing it didn't help with anything, so I am adding it back on a purely superstitious basis.
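low_cpu_mem_usage=True is a standard transformers from_pretrained flag that avoids materializing a full random-weight copy of the model in RAM before the checkpoint weights are loaded. A minimal example of the flag in use; the checkpoint name is arbitrary:

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-1.3b",       # arbitrary example checkpoint
    low_cpu_mem_usage=True,    # build the model empty, then load weights shard by shard
    torch_dtype=torch.float16,
)
```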
oobabooga | d4a0b377ab | Allow standalone --cpu-memory | 2023-01-31 21:23:16 -03:00
    I think that what I am doing probably makes sense, but I could be wrong.
oobabooga | 8ef89df746 | Try to leave at least 1GiB free to prevent OOM errors | 2023-01-31 20:47:05 -03:00
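A sketch of the headroom idea in the commit above: when building an accelerate-style max_memory map, cap each GPU at its total VRAM minus roughly 1 GiB so generation has scratch space. The helper below is illustrative; the actual --gpu-memory/--cpu-memory flags and defaults belong to the project:

```python
import torch

GiB = 1024 ** 3

def build_max_memory(cpu_memory_gib: int = 64) -> dict:
    """Per-device memory caps, leaving ~1 GiB of VRAM free on each GPU."""
    max_memory = {}
    for i in range(torch.cuda.device_count()):
        total = torch.cuda.get_device_properties(i).total_memory
        max_memory[i] = f"{max(total - GiB, 0) // (1024 ** 2)}MiB"
    max_memory["cpu"] = f"{cpu_memory_gib}GiB"
    return max_memory

# Typically passed as from_pretrained(..., device_map="auto", max_memory=build_max_memory()).
```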
oobabooga | bb77f20a6c | Don't use low_cpu_mem_usage and device_map together | 2023-01-31 13:24:05 -03:00
oobabooga | 001ecf95b2 | Update server.py | 2023-01-31 08:14:16 -03:00
Silver267 | a85bb5e9a2 | Fix an error | 2023-01-31 01:34:10 -05:00
    Fixes "UnboundLocalError: local variable 'substring_found' referenced before assignment" when loading non-pygmalion models in cai chat mode.
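The error named above is what Python raises whenever a local variable is only assigned inside a conditional branch and then read unconditionally. A minimal reproduction and the usual fix, with hypothetical names mirroring the message:

```python
def uses_pygmalion_format(model_name: str) -> bool:
    substring_found = False                 # the fix: give the variable a default before any branch
    if "pygmalion" in model_name.lower():
        substring_found = True
    # Without the default above, models that skip the branch would raise:
    # UnboundLocalError: local variable 'substring_found' referenced before assignment
    return substring_found
```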
oobabooga | 5b0bbfa6e8 | Clean up | 2023-01-30 14:17:12 -03:00
oobabooga | 2dadf42cb5 | Print the tokenized example dialogue in a prettier way | 2023-01-30 08:29:49 -03:00
oobabooga | 161cae001b | I needed this | 2023-01-29 23:20:22 -03:00
oobabooga | 3ebca480f6 | Minor fix | 2023-01-29 23:05:17 -03:00
oobabooga | 00707a0b3b | Add "Impersonate" button | 2023-01-29 22:56:23 -03:00
oobabooga | de72e83508 | Reorganize things | 2023-01-29 14:27:22 -03:00
oobabooga | 6fbfee9e6d | Remove some bloat | 2023-01-29 12:05:18 -03:00
oobabooga | 9c9bd1074f | Add option to replace the bot's last reply | 2023-01-29 12:02:44 -03:00
oobabooga | e5ff4ddfc8 | Add bot prefix modifier option in extensions | 2023-01-29 10:11:59 -03:00
oobabooga | b6d01bb704 | Enable extensions in all modes, not just chat | 2023-01-29 09:48:18 -03:00
oobabooga | 1a139664f5 | Grammar | 2023-01-29 02:54:36 -03:00
oobabooga | 2d134031ca | Apply extensions to character greeting | 2023-01-29 00:04:11 -03:00
oobabooga | e349b52256 | Read extensions parameters from settings file | 2023-01-28 23:21:40 -03:00
oobabooga | 2239be2351 | Support for number/bool extension parameters | 2023-01-28 23:08:28 -03:00
oobabooga | 6da94e358c | Add support for extensions parameters | 2023-01-28 23:00:51 -03:00
    Still experimental
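The extension commits in this stretch describe a plugin model where each extension ships a params dict (strings, numbers, booleans) that the server can override from its settings file, plus hooks such as a bot prefix modifier. A hedged sketch of that shape; the hook and parameter names are modeled on the commit messages, not copied from the codebase:

```python
# extensions/example/script.py (hypothetical layout)
params = {
    "enabled": True,        # bool parameter
    "max_words": 100,       # number parameter
    "prefix": "*smiles* ",  # string parameter
}

def bot_prefix_modifier(prefix: str) -> str:
    """Hook applied to the text that seeds the bot's reply."""
    return prefix + params["prefix"] if params["enabled"] else prefix

# Server side: overriding extension parameters from a settings file.
def apply_settings(settings: dict) -> None:
    for key, value in settings.get("example-params", {}).items():
        if key in params:
            params[key] = value
```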
oobabooga | e779fd795f | Save TavernAI characters with TavernAI- prefix | 2023-01-28 21:01:56 -03:00
oobabooga | 833a1138fa | Explain the dialogue tokenization output | 2023-01-28 20:41:02 -03:00
oobabooga | 545b7395b2 | Prevent huge --help outputs | 2023-01-28 20:36:51 -03:00
oobabooga | f4c455ce29 | Merge pull request #30 from 10sa/patch-1 | 2023-01-28 20:35:20 -03:00
    Add listening port options for listening mode.
oobabooga | 7b283a4a3d | Update server.py | 2023-01-28 20:35:05 -03:00
oobabooga | f4674d34a9 | Reorganize chat UI elements | 2023-01-28 20:28:08 -03:00
oobabooga | 3687962e6c | Add support for TavernAI character cards (closes #31) | 2023-01-28 20:18:23 -03:00
oobabooga | f71531186b | Upload profile pictures from the web UI | 2023-01-28 19:16:37 -03:00
Tensa | 3742d3b18a | Add listening port options for listening mode. | 2023-01-28 03:38:34 +09:00
oobabooga | 69ffef4391 | History loading minor bug fix | 2023-01-27 12:01:11 -03:00
oobabooga | 8b8236c6ff | Fix Regenerate button bug | 2023-01-27 11:14:19 -03:00
oobabooga | 1d1f931757 | Load extensions at startup | 2023-01-27 10:53:05 -03:00
oobabooga | 70e034589f | Update the export/load chat history functions | 2023-01-27 02:16:05 -03:00
oobabooga | 6b5dcd46c5 | Add support for extensions | 2023-01-27 00:40:39 -03:00
    This is experimental.
oobabooga | e69990e37b | Change order of upload and download tabs in chat mode | 2023-01-26 16:57:12 -03:00
oobabooga | ac6065d5ed | Fix character loading bug | 2023-01-26 13:45:19 -03:00
oobabooga | 61611197e0 | Add --verbose option (oops) | 2023-01-26 02:18:06 -03:00
oobabooga | abc920752f | Stop at eos_token while streaming text (for #26) | 2023-01-25 22:27:04 -03:00
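A sketch of stopping at the EOS token during streaming: compare each newly produced token id against the tokenizer's eos_token_id and break out of the loop. This is a generic pattern, not the project's exact generation loop:

```python
import torch

def stream_tokens(model, tokenizer, input_ids, max_new_tokens=200):
    """Yield decoded text one token at a time, stopping at the EOS token."""
    generated = input_ids
    for _ in range(max_new_tokens):
        out = model.generate(generated, max_new_tokens=1, do_sample=True)
        next_id = out[:, -1:]                      # shape (1, 1): the newly sampled token
        if next_id.item() == tokenizer.eos_token_id:
            break                                  # stop streaming as soon as EOS appears
        generated = torch.cat([generated, next_id], dim=1)
        yield tokenizer.decode(generated[0, input_ids.shape[1]:])
```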
oobabooga | b77933d327 | File names must be img_me.jpg and img_bot.jpg | 2023-01-25 19:40:30 -03:00
oobabooga | fc73188ec7 | Allow specifying your own profile picture in chat mode | 2023-01-25 19:37:44 -03:00
oobabooga | 3fa14befc5 | Bump the gradio version, add back the queue | 2023-01-25 16:10:35 -03:00
oobabooga | 7a3717b824 | Allow uploading characters | 2023-01-25 15:45:25 -03:00
oobabooga | 6388c7fbc0 | Set queue size to 1 to prevent gradio undefined behavior | 2023-01-25 14:37:41 -03:00
oobabooga | ec69c190ba | Keep the character's greeting/example dialogue when "clear history" is clicked | 2023-01-25 10:52:35 -03:00
oobabooga | ebed1dea56 | Generate 8 tokens at a time in streaming mode instead of just 1 | 2023-01-25 10:38:26 -03:00
    This is a performance optimization.
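The optimization above amortizes the fixed cost of each generate() call by asking for several tokens per step instead of one, at the price of coarser-grained UI updates. A generic sketch of the idea; the chunk size and function names are illustrative:

```python
import torch

def stream_in_chunks(model, tokenizer, input_ids, max_new_tokens=200, chunk=8):
    """Stream output in chunks of `chunk` tokens rather than one token per generate() call."""
    generated = input_ids
    produced = 0
    while produced < max_new_tokens:
        step = min(chunk, max_new_tokens - produced)
        out = model.generate(generated, max_new_tokens=step, do_sample=True)
        new_tokens = out[:, generated.shape[1]:]
        generated = torch.cat([generated, new_tokens], dim=1)
        produced += new_tokens.shape[1]
        yield tokenizer.decode(generated[0, input_ids.shape[1]:])
        if tokenizer.eos_token_id in new_tokens[0].tolist():
            break   # an EOS inside the chunk ends the stream
```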
oobabooga | 3b8f0021cc | Stop generating at \nYou: in chat mode | 2023-01-25 10:17:55 -03:00
oobabooga | 54e77acac4 | Rename to "Generation parameters preset" for clarity | 2023-01-23 20:49:44 -03:00
oobabooga | ce4756fb88 | Allow uploading chat history in official pygmalion web ui format | 2023-01-23 15:29:01 -03:00
oobabooga | 8325e23923 | Fix bug in loading chat history as text file | 2023-01-23 14:28:02 -03:00
oobabooga | 059d47edb5 | Submit with enter instead of shift+enter in chat mode | 2023-01-23 14:04:01 -03:00
oobabooga | 4820379139 | Add debug preset (deterministic, should always give the same responses) | 2023-01-23 13:36:01 -03:00
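In transformers terms, a deterministic preset boils down to disabling sampling so the model always takes the highest-probability token. A hedged sketch of what such a preset might contain; the exact keys in the project's preset files may differ:

```python
# Greedy decoding: repeated runs on the same prompt give the same output.
debug_deterministic = {
    "do_sample": False,
    "num_beams": 1,
    "max_new_tokens": 200,
}

# output = model.generate(input_ids, **debug_deterministic)
```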
oobabooga | 947b50e8ea | Allow uploading chat history as simple text files | 2023-01-23 09:45:10 -03:00
oobabooga | ebf720585b | Mention time and it/s in terminal with streaming off | 2023-01-22 20:07:19 -03:00
oobabooga | d87310ad61 | Send last input to the input box when "Remove last" is clicked | 2023-01-22 19:40:22 -03:00
oobabooga | d0ea6d5f86 | Make the maximum history size in prompt unlimited by default | 2023-01-22 17:17:35 -03:00
oobabooga | 00f3b0996b | Warn the user that chat mode becomes a lot slower with text streaming | 2023-01-22 16:19:11 -03:00
oobabooga | c5cc3a3075 | Fix bug in "remove last" button | 2023-01-22 13:10:36 -03:00
oobabooga | a410cf1345 | Mention that "Chat history size" means "Chat history size in prompt" | 2023-01-22 03:15:35 -03:00
oobabooga | b3e1a874bc | Fix bug in loading history | 2023-01-22 02:32:54 -03:00
oobabooga | 62b533f344 | Add "regenerate" button to the chat | 2023-01-22 02:19:58 -03:00
oobabooga | 94ecbc6dff | Export history as nicely formatted json | 2023-01-22 01:24:16 -03:00
oobabooga | deacb96c34 | Change the pygmalion default context | 2023-01-22 00:49:59 -03:00
oobabooga | 23f94f559a | Improve the chat prompt design | 2023-01-22 00:35:42 -03:00
oobabooga | 139e2f0ab4 | Redesign the upload/download chat history buttons | 2023-01-22 00:22:50 -03:00
oobabooga | 434d4b128c | Add refresh buttons for the model/preset/character menus | 2023-01-22 00:02:46 -03:00
oobabooga | 1e5e56fa2e | Better recognize the 4chan model (for #19) | 2023-01-21 22:13:01 -03:00
oobabooga | aadf4e899a | Improve example dialogue handling | 2023-01-21 15:04:13 -03:00
oobabooga | f9dbe7e08e | Update README | 2023-01-21 03:05:55 -03:00
oobabooga | 27e2d932b0 | Don't include the example dialogue in the export json | 2023-01-21 02:55:13 -03:00
oobabooga | 990ee54ddd | Move the example dialogue to the chat history, and keep it hidden | 2023-01-21 02:48:06 -03:00
    This greatly improves the performance of text generation, as histories can be quite long. It also makes more sense to implement it this way.
|
oobabooga
|
d7299df01f
|
Rename parameters
|
2023-01-21 00:33:41 -03:00 |
|
oobabooga
|
5df03bf0fd
|
Merge branch 'main' into main
|
2023-01-21 00:25:34 -03:00 |
|
oobabooga
|
faaafe7c0e
|
Better parameter naming
|
2023-01-20 23:45:16 -03:00 |
|