Commit Graph

320 Commits

Author SHA1 Message Date
oobabooga
0a3590da8c Add a progress bar 2023-02-24 14:19:27 -03:00
oobabooga
3b8cecbab7 Reload the default chat on page refresh 2023-02-23 19:50:23 -03:00
oobabooga
f1914115d3 Fix minor issue with chat logs 2023-02-23 16:04:47 -03:00
oobabooga
2e86a1ec04 Move chat history into shared module 2023-02-23 15:11:18 -03:00
oobabooga
c87800341c Move function to extensions module 2023-02-23 14:55:21 -03:00
oobabooga
7224343a70 Improve the imports 2023-02-23 14:41:42 -03:00
oobabooga
364529d0c7 Further refactor 2023-02-23 14:31:28 -03:00
oobabooga
e46c43afa6 Move some stuff from server.py to modules 2023-02-23 13:42:23 -03:00
oobabooga
1dacd34165 Further refactor 2023-02-23 13:28:30 -03:00
oobabooga
ce7feb3641 Further refactor 2023-02-23 13:03:52 -03:00
oobabooga
98af4bfb0d Refactor the code to make it more modular 2023-02-23 12:05:25 -03:00
oobabooga
18e0ec955e Improve some descriptions in --help 2023-02-23 10:11:58 -03:00
oobabooga
c72892835a Don't show *-np models in the list of choices 2023-02-22 11:38:16 -03:00
oobabooga
044b963987 Add stop parameter for flexgen (#105) 2023-02-22 11:23:36 -03:00
oobabooga
ea21a22940 Remove redundant preset 2023-02-22 01:01:26 -03:00
oobabooga
b8b3d4139c Add --compress-weight parameter 2023-02-22 00:43:21 -03:00
oobabooga
eef6fc3cbf Add a preset for FlexGen 2023-02-21 23:33:15 -03:00
oobabooga
311404e258 Reuse disk-cache-dir parameter for flexgen 2023-02-21 22:11:05 -03:00
oobabooga
f3c75bbd64 Add --percent flag for flexgen 2023-02-21 22:08:46 -03:00
oobabooga
b83f51ee04 Add FlexGen support #92 (experimental) 2023-02-21 21:00:06 -03:00
oobabooga
444cd69c67 Fix regex bug in loading character jsons with special characters 2023-02-20 19:38:19 -03:00
oobabooga
d7a738fb7a Load any 13b/20b/30b model in 8-bit mode when no flags are supplied 2023-02-20 15:44:10 -03:00
oobabooga
77846ceef3 Minor change 2023-02-20 15:05:48 -03:00
oobabooga
e195377050 Deprecate torch dumps, move to safetensors (they load even faster) 2023-02-20 15:03:19 -03:00
oobabooga
14ffa0b418 Fix line breaks in --chat mode 2023-02-20 13:25:46 -03:00
SillyLossy
ded890c378 Escape regexp in message extraction 2023-02-19 12:55:45 +02:00
oobabooga
8c9dd95d55 Print the softprompt metadata when it is loaded 2023-02-19 01:48:23 -03:00
oobabooga
f79805f4a4 Change a comment 2023-02-18 22:58:40 -03:00
oobabooga
d58544a420 Some minor formatting changes 2023-02-18 11:07:55 -03:00
oobabooga
0dd41e4830 Reorganize the sliders some more 2023-02-17 16:33:27 -03:00
oobabooga
6b9ac2f88e Reorganize the generation parameters 2023-02-17 16:18:01 -03:00
oobabooga
596732a981 The soft prompt length must be considered here too 2023-02-17 12:35:30 -03:00
oobabooga
edc0262889 Minor file uploading fixes 2023-02-17 10:27:41 -03:00
oobabooga
243244eeec Attempt at fixing greyed out files on iPhone 2023-02-17 10:17:15 -03:00
oobabooga
a226f4cddb No change, so reverting 2023-02-17 09:27:17 -03:00
oobabooga
40cb9f63f6 Try making Colab happy (tensorflow warnings) 2023-02-17 09:23:11 -03:00
oobabooga
aeddf902ec Make the refresh button prettier 2023-02-16 21:55:20 -03:00
oobabooga
21512e2790 Make the Stop button work more reliably 2023-02-16 21:21:45 -03:00
oobabooga
08805b3374 Force "You" in impersonate too 2023-02-16 13:24:13 -03:00
oobabooga
d7db04403f Fix --chat chatbox height 2023-02-16 12:45:05 -03:00
oobabooga
589069e105 Don't regenerate if no message has been sent 2023-02-16 12:32:35 -03:00
oobabooga
405dfbf57c Force your name to be "You" for pygmalion (properly) 2023-02-16 12:16:12 -03:00
oobabooga
7bd2ae05bf Force your name to be "You" for pygmalion
This allows you to customize your displayed name.
2023-02-15 21:32:53 -03:00
oobabooga
3746d72853 More style fixes 2023-02-15 21:13:12 -03:00
oobabooga
6f213b8c14 Style fix 2023-02-15 20:58:17 -03:00
oobabooga
ccf10db60f Move stuff into tabs in chat mode 2023-02-15 20:55:32 -03:00
oobabooga
a55e8836f6 Bump gradio version
It looks uglier, but the old one was bugged and unstable.
2023-02-15 20:20:56 -03:00
oobabooga
0e89ff4b13 Clear the persistent history after clicking on "Clear history" 2023-02-15 16:49:52 -03:00
oobabooga
b3bcd2881d Implement regenerate/impersonate the proper way (fixes #78) 2023-02-15 14:39:26 -03:00
oobabooga
5ee9283cae Mention BLIP 2023-02-15 13:53:38 -03:00
oobabooga
8d3b3959e7 Document --picture option 2023-02-15 13:50:18 -03:00
oobabooga
2eea0f4edb Minor change 2023-02-15 12:58:11 -03:00
oobabooga
3c31fa7079 Simplifications 2023-02-15 12:46:11 -03:00
oobabooga
80fbc584f7 Readability 2023-02-15 11:38:44 -03:00
oobabooga
b397bea387 Make chat history persistent 2023-02-15 11:30:38 -03:00
oobabooga
7be372829d Set chat prompt size in tokens 2023-02-15 10:18:50 -03:00
oobabooga
8c3ef58e00 Use BLIP directly + some simplifications 2023-02-14 23:55:46 -03:00
SillyLossy
a7d98f494a Use BLIP to send a picture to model 2023-02-15 01:38:21 +02:00
oobabooga
d910d435cd Consider the softprompt in the maximum prompt length calculation 2023-02-14 12:06:47 -03:00
oobabooga
8b3bb512ef Minor bug fix (soft prompt was being loaded twice) 2023-02-13 23:34:04 -03:00
oobabooga
7739a29524 Some simplifications 2023-02-13 18:48:32 -03:00
oobabooga
3277b751f5 Add softprompt support (for real this time)
Is this too much voodoo for our purposes?
2023-02-13 15:25:16 -03:00
oobabooga
aa1177ff15 Send last internal reply to input rather than visible 2023-02-13 03:29:23 -03:00
oobabooga
2c3abcf57a Add support for rosey/chip/joi instruct models 2023-02-12 09:46:34 -03:00
oobabooga
7ef7bba6e6 Add progress bar for model loading 2023-02-12 09:36:27 -03:00
oobabooga
5d3f15b915 Use the CPU if no GPU is detected 2023-02-11 23:17:06 -03:00
oobabooga
b3c4657c47 Remove commas from preset files 2023-02-11 14:54:29 -03:00
oobabooga
0dd1409f24 Add penalty_alpha parameter (contrastive search) 2023-02-11 14:48:12 -03:00
oobabooga
2ed0386d87 Fix replace last reply in --chat mode (for #69) 2023-02-11 07:59:54 -03:00
oobabooga
316e07f06a auto-assign gpu memory with --auto-devices alone 2023-02-10 16:36:06 -03:00
oobabooga
219366342b Sort imports according to PEP8 (based on #67) 2023-02-10 15:40:03 -03:00
81300
20dbef9623 Extend bfloat16 support 2023-02-09 20:00:03 +02:00
oobabooga
cadd100405 min_length has to be 0 when streaming is on 2023-02-08 00:23:35 -03:00
oobabooga
6be571cff7 Better variable names 2023-02-08 00:19:20 -03:00
oobabooga
58b07cca81 length_penalty can be negative (apparently) 2023-02-07 23:33:02 -03:00
oobabooga
7e4c25691d Repetition penalty has to be < 5 2023-02-07 23:23:39 -03:00
oobabooga
1c30e1b49a Add even more sliders 2023-02-07 23:11:04 -03:00
oobabooga
24dc705eca Add lots of sliders 2023-02-07 22:08:21 -03:00
Martin J
06a4664805 Fix a regex issue in tokenize_dialogue.
The existing regex would fail if using character names that start with
numbers, for example: 9S or 2B.
2023-02-05 07:42:57 +01:00
oobabooga
2fe235738e Reorganize chat buttons 2023-02-04 22:53:42 -03:00
oobabooga
2207d44986 Windows doesn't like : in filenames 2023-02-04 20:07:39 -03:00
oobabooga
65266f3349 Fix loading official colab chat logs 2023-02-03 22:43:02 -03:00
oobabooga
44e8c671f9 Fix API documentation formatting in chat mode 2023-02-03 10:00:05 -03:00
oobabooga
a28f0d8bd7 Show it/s in the same units with or without streaming
Closes #49
2023-02-03 09:11:11 -03:00
oobabooga
4e4cd67223 Save chat history with name/date in filename
closes #50
2023-02-03 09:02:35 -03:00
oobabooga
3af3ffeb90 Make --help output more readable 2023-02-02 23:36:28 -03:00
oobabooga
638495b633 Simplify generate() function 2023-02-02 13:47:08 -03:00
oobabooga
3f05cf5ddd Simplify encode() function 2023-02-02 13:31:32 -03:00
oobabooga
2583bc5840 Simplify deepspeed implementation (#40) 2023-02-02 12:15:44 -03:00
oobabooga
f38c9bf428 Fix deepspeed (oops) 2023-02-02 10:39:37 -03:00
oobabooga
90f1067598 Move deepspeed parameters to another file 2023-02-02 10:25:09 -03:00
81300
248ec4fa21 Merge branch 'oobabooga:main' into ds 2023-02-01 20:50:51 +02:00
81300
a6f4760772 Add arg for bfloat16 2023-02-01 20:22:07 +02:00
81300
c515282f5c no_split_module_classes not needed 2023-02-01 19:47:26 +02:00
81300
0a0d289537 Fix issue with generating on multiple GPUs 2023-02-01 19:02:07 +02:00
81300
a97afa6965 Add DeepSpeed ZeRO-3 integration 2023-02-01 18:48:13 +02:00
oobabooga
6b13816c47 Change default --disk behavior 2023-02-01 10:43:28 -03:00
oobabooga
119be56390 Add back low_cpu_mem_usage=True
Removing it didn't help with anything, so I am adding it back on a purely superstitious basis.
2023-02-01 10:01:44 -03:00
oobabooga
d4a0b377ab Allow standalone --cpu-memory
I think that what I am doing probably makes sense, but I could be wrong.
2023-01-31 21:23:16 -03:00
oobabooga
8ef89df746 Try to leave at least 1GiB free to prevent oom errors 2023-01-31 20:47:05 -03:00