Commit Graph

198 Commits

Author SHA1 Message Date
oobabooga
cadd100405 min_length has to be 0 when streaming is on 2023-02-08 00:23:35 -03:00
oobabooga
6be571cff7 Better variable names 2023-02-08 00:19:20 -03:00
oobabooga
58b07cca81 length_penalty can be negative (apparently) 2023-02-07 23:33:02 -03:00
oobabooga
7e4c25691d Repetition penalty has to be < 5 2023-02-07 23:23:39 -03:00
oobabooga
1c30e1b49a Add even more sliders 2023-02-07 23:11:04 -03:00
oobabooga
24dc705eca Add lots of sliders 2023-02-07 22:08:21 -03:00
Martin J
06a4664805 Fix a regex issue in tokenize_dialogue.
The existing regex would fail if using character names that start with
numbers, for example: 9S or 2B.
2023-02-05 07:42:57 +01:00
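The entry above explains the failure but not the shape of the fix. As a purely illustrative sketch (not the repository's actual tokenize_dialogue pattern), a speaker-name regex that insists on a leading letter silently skips names such as 9S or 2B, while one built on \w accepts them:

```python
import re

dialogue = "9S: Hello.\n2B: Understood."

# Requiring the name to start with a letter misses digit-led names.
strict = re.findall(r"^([A-Za-z][\w ]*):", dialogue, flags=re.MULTILINE)
print(strict)    # []

# Accepting any word character at the start matches both speakers.
relaxed = re.findall(r"^(\w[\w ]*):", dialogue, flags=re.MULTILINE)
print(relaxed)   # ['9S', '2B']
```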
oobabooga
2fe235738e Reorganize chat buttons 2023-02-04 22:53:42 -03:00
oobabooga
2207d44986 Windows doesn't like : in filenames 2023-02-04 20:07:39 -03:00
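As a hypothetical illustration of the constraint noted above (the project's actual log filenames may look different), timestamps used in Windows-safe filenames should avoid the colon that ISO times contain:

```python
from datetime import datetime
from pathlib import Path

# ':' is illegal in Windows filenames, so format the time without it
# (or strip it with .replace(":", "-") from an existing ISO string).
stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
log_path = Path("logs") / f"chat-{stamp}.json"
print(log_path)   # e.g. logs/chat-20230204-200739.json
```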
oobabooga
65266f3349 Fix loading official colab chat logs 2023-02-03 22:43:02 -03:00
oobabooga
44e8c671f9 Fix API documentation formatting in chat mode 2023-02-03 10:00:05 -03:00
oobabooga
a28f0d8bd7 Show it/s in the same units with or without streaming
Closes #49
2023-02-03 09:11:11 -03:00
oobabooga
4e4cd67223 Save chat history with name/date in filename
closes #50
2023-02-03 09:02:35 -03:00
oobabooga
3af3ffeb90 Make --help output more readable 2023-02-02 23:36:28 -03:00
oobabooga
638495b633 Simplify generate() function 2023-02-02 13:47:08 -03:00
oobabooga
3f05cf5ddd Simplify encode() function 2023-02-02 13:31:32 -03:00
oobabooga
2583bc5840 Simplify deepspeed implementation (#40) 2023-02-02 12:15:44 -03:00
oobabooga
f38c9bf428 Fix deepspeed (oops) 2023-02-02 10:39:37 -03:00
oobabooga
90f1067598 Move deepspeed parameters to another file 2023-02-02 10:25:09 -03:00
81300
248ec4fa21 Merge branch 'oobabooga:main' into ds 2023-02-01 20:50:51 +02:00
81300
a6f4760772 Add arg for bfloat16 2023-02-01 20:22:07 +02:00
81300
c515282f5c no_split_module_classes not needed 2023-02-01 19:47:26 +02:00
81300
0a0d289537 Fix issue with generating on multiple GPUs 2023-02-01 19:02:07 +02:00
81300
a97afa6965 Add DeepSpeed ZeRO-3 integration 2023-02-01 18:48:13 +02:00
oobabooga
6b13816c47 Change default --disk behavior 2023-02-01 10:43:28 -03:00
oobabooga
119be56390 Add back low_cpu_mem_usage=True
Removing it didn't help with anything, so I am adding it back on a purely
superstitious basis.
2023-02-01 10:01:44 -03:00
oobabooga
d4a0b377ab Allow standalone --cpu-memory
I think that what I am doing probably makes sense, but I could be wrong.
2023-01-31 21:23:16 -03:00
oobabooga
8ef89df746 Try to leave at least 1GiB free to prevent oom errors 2023-01-31 20:47:05 -03:00
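The commit message states the margin but not the mechanism. A minimal sketch of the general idea, assuming transformers with accelerate and placeholder model and CPU-budget values, caps each GPU's max_memory roughly 1 GiB below its total:

```python
import torch
from transformers import AutoModelForCausalLM

# Leave ~1 GiB of headroom per GPU so loading does not hit the hard limit.
max_memory = {}
for i in range(torch.cuda.device_count()):
    total_gib = torch.cuda.get_device_properties(i).total_memory / 1024**3
    max_memory[i] = f"{max(int(total_gib) - 1, 0)}GiB"
max_memory["cpu"] = "32GiB"  # assumed CPU budget

model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-1.3b",     # placeholder model name
    device_map="auto",       # requires accelerate
    max_memory=max_memory,
)
```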
oobabooga
bb77f20a6c Don't use low_cpu_mem_usage and device_map together 2023-01-31 13:24:05 -03:00
oobabooga
001ecf95b2 Update server.py 2023-01-31 08:14:16 -03:00
Silver267
a85bb5e9a2 Fix an error
Fixes "UnboundLocalError: local variable 'substring_found' referenced before assignment" when loading non-pygmalion models in cai chat mode.
2023-01-31 01:34:10 -05:00
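For readers unfamiliar with that error class, a minimal illustration (function and variable names here are hypothetical, not the project's code) shows how a flag assigned only inside a loop or branch trips UnboundLocalError, and the usual fix of initializing it first:

```python
def contains_stop_string(reply, stop_strings):
    # Buggy shape: if 'substring_found' were assigned only inside the loop,
    # an empty stop_strings list would leave it undefined at the return,
    # raising UnboundLocalError. Initializing it up front fixes that.
    substring_found = False
    for s in stop_strings:
        if s in reply:
            substring_found = True
    return substring_found

print(contains_stop_string("Hello there", []))  # False, no exception
```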
oobabooga
5b0bbfa6e8 Clean up 2023-01-30 14:17:12 -03:00
oobabooga
2dadf42cb5 Print the tokenized example dialogue in a prettier way 2023-01-30 08:29:49 -03:00
oobabooga
161cae001b I needed this 2023-01-29 23:20:22 -03:00
oobabooga
3ebca480f6 Minor fix 2023-01-29 23:05:17 -03:00
oobabooga
00707a0b3b Add "Impersonate" button 2023-01-29 22:56:23 -03:00
oobabooga
de72e83508 Reorganize things 2023-01-29 14:27:22 -03:00
oobabooga
6fbfee9e6d Remove some bloat 2023-01-29 12:05:18 -03:00
oobabooga
9c9bd1074f Add option to replace the bot's last reply 2023-01-29 12:02:44 -03:00
oobabooga
e5ff4ddfc8 Add bot prefix modifier option in extensions 2023-01-29 10:11:59 -03:00
oobabooga
b6d01bb704 Enable extensions in all modes, not just chat 2023-01-29 09:48:18 -03:00
oobabooga
1a139664f5 Grammar 2023-01-29 02:54:36 -03:00
oobabooga
2d134031ca Apply extensions to character greeting 2023-01-29 00:04:11 -03:00
oobabooga
e349b52256 Read extensions parameters from settings file 2023-01-28 23:21:40 -03:00
oobabooga
2239be2351 Support for number/bool extension parameters 2023-01-28 23:08:28 -03:00
oobabooga
6da94e358c Add support for extensions parameters
Still experimental
2023-01-28 23:00:51 -03:00
oobabooga
e779fd795f Save TavernAI characters with TavernAI- prefix 2023-01-28 21:01:56 -03:00
oobabooga
833a1138fa Explain the dialogue tokenization output 2023-01-28 20:41:02 -03:00
oobabooga
545b7395b2 Prevent huge --help outputs 2023-01-28 20:36:51 -03:00
oobabooga
f4c455ce29 Merge pull request #30 from 10sa/patch-1
Add listening port options for listening mode.
2023-01-28 20:35:20 -03:00