Author | Commit | Message | Date
oobabooga | 40cb9f63f6 | Try making Colab happy (tensorflow warnings) | 2023-02-17 09:23:11 -03:00
oobabooga | aeddf902ec | Make the refresh button prettier | 2023-02-16 21:55:20 -03:00
oobabooga | 21512e2790 | Make the Stop button work more reliably | 2023-02-16 21:21:45 -03:00
oobabooga | 08805b3374 | Force "You" in impersonate too | 2023-02-16 13:24:13 -03:00
oobabooga | d7db04403f | Fix --chat chatbox height | 2023-02-16 12:45:05 -03:00
oobabooga | 589069e105 | Don't regenerate if no message has been sent | 2023-02-16 12:32:35 -03:00
oobabooga | 405dfbf57c | Force your name to be "You" for pygmalion (properly) | 2023-02-16 12:16:12 -03:00
oobabooga | 7bd2ae05bf | Force your name to be "You" for pygmalion. This allows you to customize your displayed name. | 2023-02-15 21:32:53 -03:00
oobabooga | 3746d72853 | More style fixes | 2023-02-15 21:13:12 -03:00
oobabooga | 6f213b8c14 | Style fix | 2023-02-15 20:58:17 -03:00
oobabooga | ccf10db60f | Move stuff into tabs in chat mode | 2023-02-15 20:55:32 -03:00
oobabooga | a55e8836f6 | Bump gradio version. It looks uglier, but the old one was bugged and unstable. | 2023-02-15 20:20:56 -03:00
oobabooga | 0e89ff4b13 | Clear the persistent history after clicking on "Clear history" | 2023-02-15 16:49:52 -03:00
oobabooga | b3bcd2881d | Implement regenerate/impersonate the proper way (fixes #78) | 2023-02-15 14:39:26 -03:00
oobabooga | 5ee9283cae | Mention BLIP | 2023-02-15 13:53:38 -03:00
oobabooga | 8d3b3959e7 | Document --picture option | 2023-02-15 13:50:18 -03:00
oobabooga | 2eea0f4edb | Minor change | 2023-02-15 12:58:11 -03:00
oobabooga | 3c31fa7079 | Simplifications | 2023-02-15 12:46:11 -03:00
oobabooga | 80fbc584f7 | Readability | 2023-02-15 11:38:44 -03:00
oobabooga | b397bea387 | Make chat history persistent | 2023-02-15 11:30:38 -03:00
oobabooga | 7be372829d | Set chat prompt size in tokens | 2023-02-15 10:18:50 -03:00
oobabooga | 8c3ef58e00 | Use BLIP directly + some simplifications | 2023-02-14 23:55:46 -03:00
SillyLossy | a7d98f494a | Use BLIP to send a picture to model | 2023-02-15 01:38:21 +02:00
oobabooga | d910d435cd | Consider the softprompt in the maximum prompt length calculation | 2023-02-14 12:06:47 -03:00
oobabooga | 8b3bb512ef | Minor bug fix (soft prompt was being loaded twice) | 2023-02-13 23:34:04 -03:00
oobabooga | 7739a29524 | Some simplifications | 2023-02-13 18:48:32 -03:00
oobabooga | 3277b751f5 | Add softprompt support (for real this time). Is this too much voodoo for our purposes? | 2023-02-13 15:25:16 -03:00
oobabooga | aa1177ff15 | Send last internal reply to input rather than visible | 2023-02-13 03:29:23 -03:00
oobabooga | 2c3abcf57a | Add support for rosey/chip/joi instruct models | 2023-02-12 09:46:34 -03:00
oobabooga | 7ef7bba6e6 | Add progress bar for model loading | 2023-02-12 09:36:27 -03:00
oobabooga | 5d3f15b915 | Use the CPU if no GPU is detected | 2023-02-11 23:17:06 -03:00
oobabooga | b3c4657c47 | Remove commas from preset files | 2023-02-11 14:54:29 -03:00
oobabooga | 0dd1409f24 | Add penalty_alpha parameter (contrastive search) | 2023-02-11 14:48:12 -03:00
oobabooga | 2ed0386d87 | Fix replace last reply in --chat mode (for #69) | 2023-02-11 07:59:54 -03:00
oobabooga | 316e07f06a | Auto-assign GPU memory with --auto-devices alone | 2023-02-10 16:36:06 -03:00
oobabooga | 219366342b | Sort imports according to PEP8 (based on #67) | 2023-02-10 15:40:03 -03:00
81300 | 20dbef9623 | Extend bfloat16 support | 2023-02-09 20:00:03 +02:00
oobabooga | cadd100405 | min_length has to be 0 when streaming is on | 2023-02-08 00:23:35 -03:00
oobabooga | 6be571cff7 | Better variable names | 2023-02-08 00:19:20 -03:00
oobabooga | 58b07cca81 | length_penalty can be negative (apparently) | 2023-02-07 23:33:02 -03:00
oobabooga | 7e4c25691d | Repetition penalty has to be < 5 | 2023-02-07 23:23:39 -03:00
oobabooga | 1c30e1b49a | Add even more sliders | 2023-02-07 23:11:04 -03:00
oobabooga | 24dc705eca | Add lots of sliders | 2023-02-07 22:08:21 -03:00
Martin J | 06a4664805 | Fix a regex issue in tokenize_dialogue. The existing regex would fail if using character names that start with numbers, for example 9S or 2B. | 2023-02-05 07:42:57 +01:00
oobabooga | 2fe235738e | Reorganize chat buttons | 2023-02-04 22:53:42 -03:00
oobabooga | 2207d44986 | Windows doesn't like : in filenames | 2023-02-04 20:07:39 -03:00
oobabooga | 65266f3349 | Fix loading official colab chat logs | 2023-02-03 22:43:02 -03:00
oobabooga | 44e8c671f9 | Fix API documentation formatting in chat mode | 2023-02-03 10:00:05 -03:00
oobabooga | a28f0d8bd7 | Show it/s in the same units with or without streaming. Closes #49 | 2023-02-03 09:11:11 -03:00
oobabooga | 4e4cd67223 | Save chat history with name/date in filename. Closes #50 | 2023-02-03 09:02:35 -03:00
oobabooga | 3af3ffeb90 | Make --help output more readable | 2023-02-02 23:36:28 -03:00
oobabooga | 638495b633 | Simplify generate() function | 2023-02-02 13:47:08 -03:00
oobabooga | 3f05cf5ddd | Simplify encode() function | 2023-02-02 13:31:32 -03:00
oobabooga | 2583bc5840 | Simplify deepspeed implementation (#40) | 2023-02-02 12:15:44 -03:00
oobabooga | f38c9bf428 | Fix deepspeed (oops) | 2023-02-02 10:39:37 -03:00
oobabooga | 90f1067598 | Move deepspeed parameters to another file | 2023-02-02 10:25:09 -03:00
81300 | 248ec4fa21 | Merge branch 'oobabooga:main' into ds | 2023-02-01 20:50:51 +02:00
81300 | a6f4760772 | Add arg for bfloat16 | 2023-02-01 20:22:07 +02:00
81300 | c515282f5c | no_split_module_classes not needed | 2023-02-01 19:47:26 +02:00
81300 | 0a0d289537 | Fix issue with generating on multiple GPUs | 2023-02-01 19:02:07 +02:00
81300 | a97afa6965 | Add DeepSpeed ZeRO-3 integration | 2023-02-01 18:48:13 +02:00
oobabooga | 6b13816c47 | Change default --disk behavior | 2023-02-01 10:43:28 -03:00
oobabooga | 119be56390 | Add back low_cpu_mem_usage=True. Removing it didn't help with anything, so I am adding it back on a purely superstitious basis. | 2023-02-01 10:01:44 -03:00
oobabooga | d4a0b377ab | Allow standalone --cpu-memory. I think that what I am doing probably makes sense, but I could be wrong. | 2023-01-31 21:23:16 -03:00
oobabooga | 8ef89df746 | Try to leave at least 1GiB free to prevent OOM errors | 2023-01-31 20:47:05 -03:00
oobabooga | bb77f20a6c | Don't use low_cpu_mem_usage and device_map together | 2023-01-31 13:24:05 -03:00
oobabooga | 001ecf95b2 | Update server.py | 2023-01-31 08:14:16 -03:00
Silver267 | a85bb5e9a2 | Fix an error. Fixes "UnboundLocalError: local variable 'substring_found' referenced before assignment" when loading non-pygmalion models in cai chat mode. | 2023-01-31 01:34:10 -05:00
oobabooga | 5b0bbfa6e8 | Clean up | 2023-01-30 14:17:12 -03:00
oobabooga | 2dadf42cb5 | Print the tokenized example dialogue in a prettier way | 2023-01-30 08:29:49 -03:00
oobabooga | 161cae001b | I needed this | 2023-01-29 23:20:22 -03:00
oobabooga | 3ebca480f6 | Minor fix | 2023-01-29 23:05:17 -03:00
oobabooga | 00707a0b3b | Add "Impersonate" button | 2023-01-29 22:56:23 -03:00
oobabooga | de72e83508 | Reorganize things | 2023-01-29 14:27:22 -03:00
oobabooga | 6fbfee9e6d | Remove some bloat | 2023-01-29 12:05:18 -03:00
oobabooga | 9c9bd1074f | Add option to replace the bot's last reply | 2023-01-29 12:02:44 -03:00
oobabooga | e5ff4ddfc8 | Add bot prefix modifier option in extensions | 2023-01-29 10:11:59 -03:00
oobabooga | b6d01bb704 | Enable extensions in all modes, not just chat | 2023-01-29 09:48:18 -03:00
oobabooga | 1a139664f5 | Grammar | 2023-01-29 02:54:36 -03:00
oobabooga | 2d134031ca | Apply extensions to character greeting | 2023-01-29 00:04:11 -03:00
oobabooga | e349b52256 | Read extensions parameters from settings file | 2023-01-28 23:21:40 -03:00
oobabooga | 2239be2351 | Support for number/bool extension parameters | 2023-01-28 23:08:28 -03:00
oobabooga | 6da94e358c | Add support for extensions parameters. Still experimental. | 2023-01-28 23:00:51 -03:00
oobabooga | e779fd795f | Save TavernAI characters with TavernAI- prefix | 2023-01-28 21:01:56 -03:00
oobabooga | 833a1138fa | Explain the dialogue tokenization output | 2023-01-28 20:41:02 -03:00
oobabooga | 545b7395b2 | Prevent huge --help outputs | 2023-01-28 20:36:51 -03:00
oobabooga | f4c455ce29 | Merge pull request #30 from 10sa/patch-1. Add listening port options for listening mode. | 2023-01-28 20:35:20 -03:00
oobabooga | 7b283a4a3d | Update server.py | 2023-01-28 20:35:05 -03:00
oobabooga | f4674d34a9 | Reorganize chat UI elements | 2023-01-28 20:28:08 -03:00
oobabooga | 3687962e6c | Add support for TavernAI character cards (closes #31) | 2023-01-28 20:18:23 -03:00
oobabooga | f71531186b | Upload profile pictures from the web UI | 2023-01-28 19:16:37 -03:00
Tensa | 3742d3b18a | Add listening port options for listening mode | 2023-01-28 03:38:34 +09:00
oobabooga | 69ffef4391 | History loading minor bug fix | 2023-01-27 12:01:11 -03:00
oobabooga | 8b8236c6ff | Fix Regenerate button bug | 2023-01-27 11:14:19 -03:00
oobabooga | 1d1f931757 | Load extensions at startup | 2023-01-27 10:53:05 -03:00
oobabooga | 70e034589f | Update the export/load chat history functions | 2023-01-27 02:16:05 -03:00
oobabooga | 6b5dcd46c5 | Add support for extensions. This is experimental. | 2023-01-27 00:40:39 -03:00
oobabooga | e69990e37b | Change order of upload and download tabs in chat mode | 2023-01-26 16:57:12 -03:00
oobabooga | ac6065d5ed | Fix character loading bug | 2023-01-26 13:45:19 -03:00
oobabooga | 61611197e0 | Add --verbose option (oops) | 2023-01-26 02:18:06 -03:00
oobabooga | abc920752f | Stop at eos_token while streaming text (for #26) | 2023-01-25 22:27:04 -03:00
oobabooga | b77933d327 | File names must be img_me.jpg and img_bot.jpg | 2023-01-25 19:40:30 -03:00
oobabooga | fc73188ec7 | Allow specifying your own profile picture in chat mode | 2023-01-25 19:37:44 -03:00
oobabooga | 3fa14befc5 | Bump the gradio version, add back the queue | 2023-01-25 16:10:35 -03:00
oobabooga | 7a3717b824 | Allow uploading characters | 2023-01-25 15:45:25 -03:00
oobabooga | 6388c7fbc0 | Set queue size to 1 to prevent gradio undefined behavior | 2023-01-25 14:37:41 -03:00
oobabooga | ec69c190ba | Keep the character's greeting/example dialogue when "clear history" is clicked | 2023-01-25 10:52:35 -03:00
oobabooga | ebed1dea56 | Generate 8 tokens at a time in streaming mode instead of just 1. This is a performance optimization. | 2023-01-25 10:38:26 -03:00
oobabooga | 3b8f0021cc | Stop generating at \nYou: in chat mode | 2023-01-25 10:17:55 -03:00
oobabooga | 54e77acac4 | Rename to "Generation parameters preset" for clarity | 2023-01-23 20:49:44 -03:00
oobabooga | ce4756fb88 | Allow uploading chat history in official pygmalion web ui format | 2023-01-23 15:29:01 -03:00
oobabooga | 8325e23923 | Fix bug in loading chat history as text file | 2023-01-23 14:28:02 -03:00
oobabooga | 059d47edb5 | Submit with enter instead of shift+enter in chat mode | 2023-01-23 14:04:01 -03:00
oobabooga | 4820379139 | Add debug preset (deterministic, should always give the same responses) | 2023-01-23 13:36:01 -03:00
oobabooga | 947b50e8ea | Allow uploading chat history as simple text files | 2023-01-23 09:45:10 -03:00
oobabooga | ebf720585b | Mention time and it/s in terminal with streaming off | 2023-01-22 20:07:19 -03:00
oobabooga | d87310ad61 | Send last input to the input box when "Remove last" is clicked | 2023-01-22 19:40:22 -03:00
oobabooga | d0ea6d5f86 | Make the maximum history size in prompt unlimited by default | 2023-01-22 17:17:35 -03:00
oobabooga | 00f3b0996b | Warn the user that chat mode becomes a lot slower with text streaming | 2023-01-22 16:19:11 -03:00
oobabooga | c5cc3a3075 | Fix bug in "remove last" button | 2023-01-22 13:10:36 -03:00
oobabooga | a410cf1345 | Mention that "Chat history size" means "Chat history size in prompt" | 2023-01-22 03:15:35 -03:00
oobabooga | b3e1a874bc | Fix bug in loading history | 2023-01-22 02:32:54 -03:00
oobabooga | 62b533f344 | Add "regenerate" button to the chat | 2023-01-22 02:19:58 -03:00
oobabooga | 94ecbc6dff | Export history as nicely formatted json | 2023-01-22 01:24:16 -03:00
oobabooga | deacb96c34 | Change the pygmalion default context | 2023-01-22 00:49:59 -03:00
oobabooga | 23f94f559a | Improve the chat prompt design | 2023-01-22 00:35:42 -03:00
oobabooga | 139e2f0ab4 | Redesign the upload/download chat history buttons | 2023-01-22 00:22:50 -03:00
oobabooga | 434d4b128c | Add refresh buttons for the model/preset/character menus | 2023-01-22 00:02:46 -03:00
oobabooga | 1e5e56fa2e | Better recognize the 4chan model (for #19) | 2023-01-21 22:13:01 -03:00
oobabooga | aadf4e899a | Improve example dialogue handling | 2023-01-21 15:04:13 -03:00
oobabooga | f9dbe7e08e | Update README | 2023-01-21 03:05:55 -03:00
oobabooga | 27e2d932b0 | Don't include the example dialogue in the export json | 2023-01-21 02:55:13 -03:00
oobabooga | 990ee54ddd | Move the example dialogue to the chat history, and keep it hidden. This greatly improves the performance of text generation, as histories can be quite long. It also makes more sense to implement it this way. | 2023-01-21 02:48:06 -03:00
oobabooga | d7299df01f | Rename parameters | 2023-01-21 00:33:41 -03:00
oobabooga | 5df03bf0fd | Merge branch 'main' into main | 2023-01-21 00:25:34 -03:00
oobabooga | faaafe7c0e | Better parameter naming | 2023-01-20 23:45:16 -03:00
Silver267 | f4634e4c32 | Update | 2023-01-20 17:05:43 -05:00
oobabooga | c0f2367b54 | Minor fix | 2023-01-20 17:09:25 -03:00
oobabooga | 185587a33e | Add a history size parameter to the chat. If too many messages are used in the prompt, the model gets really slow. It is useful to have the ability to limit this. | 2023-01-20 17:03:09 -03:00
oobabooga | 78d5a999e6 | Improve prompt formatting | 2023-01-20 01:54:38 -03:00
oobabooga | 70ff685736 | Encode the input string correctly | 2023-01-20 00:45:02 -03:00
oobabooga | b66d18d5a0 | Allow presets/characters with '.' in their names | 2023-01-19 21:56:33 -03:00
oobabooga | 11c3214981 | Fix some regexes | 2023-01-19 19:59:34 -03:00
oobabooga | e61138bdad | Minor fixes | 2023-01-19 19:04:54 -03:00
oobabooga | 2181fca709 | Better defaults for chat | 2023-01-19 18:58:45 -03:00
oobabooga | 83808171d3 | Add --share option for Colab | 2023-01-19 17:31:29 -03:00
oobabooga | 8d788874d7 | Add support for characters | 2023-01-19 16:46:46 -03:00
oobabooga | 3121f4788e | Fix uploading chat log in --chat mode | 2023-01-19 15:05:42 -03:00
oobabooga | 849e4c7f90 | Better way of finding the generated reply in the output string | 2023-01-19 14:57:01 -03:00
oobabooga | d03b0ad7a8 | Implement saving/loading chat logs (#9) | 2023-01-19 14:03:47 -03:00