oobabooga | 1a658b41aa | Merge pull request #43 from 81300/ds: Add DeepSpeed ZeRO-3 integration | 2023-02-02 10:03:19 -03:00
oobabooga | 39461bd796 | Delete .gitignore | 2023-02-02 10:01:56 -03:00
81300 | 248ec4fa21 | Merge branch 'oobabooga:main' into ds | 2023-02-01 20:50:51 +02:00
81300 | a6f4760772 | Add arg for bfloat16 | 2023-02-01 20:22:07 +02:00
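Commit a6f4760772 adds a bfloat16 argument; presumably it maps a command-line flag to the torch dtype used when loading the model. A minimal sketch of that wiring, with the flag name and model name as placeholder assumptions rather than values from the actual diff:

```python
import argparse

import torch
from transformers import AutoModelForCausalLM

parser = argparse.ArgumentParser()
# Hypothetical flag name; the real argument in the repository may differ.
parser.add_argument("--bf16", action="store_true",
                    help="Load model weights in bfloat16 instead of float16.")
args = parser.parse_args()

dtype = torch.bfloat16 if args.bf16 else torch.float16
model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b", torch_dtype=dtype)
```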
81300 | c515282f5c | no_split_module_classes not needed | 2023-02-01 19:47:26 +02:00
81300 | 0a0d289537 | Fix issue with generating on multiple GPUs | 2023-02-01 19:02:07 +02:00
81300 | a97afa6965 | Add DeepSpeed ZeRO-3 integration | 2023-02-01 18:48:13 +02:00
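For context on the DeepSpeed ZeRO-3 commits above (a97afa6965 through the merge in 1a658b41aa), Hugging Face documents a standard pattern for running a transformers model under ZeRO-3 without the Trainer: create an HfDeepSpeedConfig before calling from_pretrained so the weights are partitioned at load time, then wrap the model with deepspeed.initialize and generate through the wrapped module. A rough sketch of that pattern, with config values and the model name chosen for illustration rather than taken from the PR:

```python
import deepspeed
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.deepspeed import HfDeepSpeedConfig  # import path varies across transformers versions

ds_config = {
    "zero_optimization": {"stage": 3},
    "bf16": {"enabled": True},
    "train_micro_batch_size_per_gpu": 1,  # required by DeepSpeed even for inference
}

# Must exist *before* from_pretrained so ZeRO-3 partitions the weights while loading.
dschf = HfDeepSpeedConfig(ds_config)

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-1.3b")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")

engine = deepspeed.initialize(model=model, config_params=ds_config)[0]
engine.module.eval()

# Generation goes through the DeepSpeed-wrapped module.
input_ids = tokenizer("Hello, my name is", return_tensors="pt").input_ids.to("cuda")
output = engine.module.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(output[0]))
```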
oobabooga | 6b13816c47 | Change default --disk behavior | 2023-02-01 10:43:28 -03:00
oobabooga | 119be56390 | Add back low_cpu_mem_usage=True (removing it didn't help with anything, so I am adding it back on a purely superstitious basis) | 2023-02-01 10:01:44 -03:00
oobabooga | d4a0b377ab | Allow standalone --cpu-memory (I think that what I am doing probably makes sense, but I could be wrong) | 2023-01-31 21:23:16 -03:00
oobabooga | efb0ab502e | New preset | 2023-01-31 21:03:25 -03:00
oobabooga | 8ef89df746 | Try to leave at least 1 GiB free to prevent OOM errors | 2023-01-31 20:47:05 -03:00
oobabooga | bb77f20a6c | Don't use low_cpu_mem_usage and device_map together | 2023-01-31 13:24:05 -03:00
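Several of the commits above (119be56390, 8ef89df746 and bb77f20a6c) revolve around the keyword arguments passed to from_pretrained: whether to set low_cpu_mem_usage, capping per-GPU memory with roughly 1 GiB of headroom, and not combining low_cpu_mem_usage with device_map, since a device map already implies lazy, low-memory loading via accelerate. A hedged sketch of that kind of logic; the exact flag names and numbers in the repository may differ:

```python
import torch
from transformers import AutoModelForCausalLM

def load_model(model_name, gpu_memory_gib=None, cpu_memory_gib=None):
    kwargs = {"torch_dtype": torch.float16}

    if torch.cuda.is_available():
        # Leave roughly 1 GiB of VRAM unallocated to reduce the chance of OOM
        # errors during generation.
        total_gib = torch.cuda.get_device_properties(0).total_memory // (1024 ** 3)
        limit_gib = gpu_memory_gib if gpu_memory_gib is not None else max(total_gib - 1, 1)
        kwargs["device_map"] = "auto"
        kwargs["max_memory"] = {0: f"{limit_gib}GiB", "cpu": f"{cpu_memory_gib or 99}GiB"}
        # Note: no low_cpu_mem_usage here; a device_map already loads lazily.
    else:
        kwargs["low_cpu_mem_usage"] = True

    return AutoModelForCausalLM.from_pretrained(model_name, **kwargs)
```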
oobabooga | 824329749d | Merge pull request #38 from Silver267/patch-2: Fix an error | 2023-01-31 08:14:50 -03:00
oobabooga | 001ecf95b2 | Update server.py | 2023-01-31 08:14:16 -03:00
Silver267 | a85bb5e9a2 | Fix an error (fixes "UnboundLocalError: local variable 'substring_found' referenced before assignment" when loading non-Pygmalion models in cai chat mode) | 2023-01-31 01:34:10 -05:00
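The error fixed in a85bb5e9a2 is the usual Python pattern where a flag is only assigned inside a loop or branch and then read afterwards, which raises UnboundLocalError whenever that branch never runs. A generic illustration of the fix; the real function in the repository differs:

```python
def contains_any(names, text):
    # Without this initialization, an empty `names` list (e.g. a non-Pygmalion
    # model with no matching substrings) makes the final return raise
    # "UnboundLocalError: local variable 'substring_found' referenced before assignment".
    substring_found = False
    for name in names:
        if name.lower() in text.lower():
            substring_found = True
    return substring_found
```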
oobabooga | 5b0bbfa6e8 | Clean up | 2023-01-30 14:17:12 -03:00
oobabooga | 7aa3d6583e | Update README.md | 2023-01-30 09:45:31 -03:00
oobabooga | 239f96a9c5 | Add extensions guide | 2023-01-30 09:44:57 -03:00
oobabooga | dfbca86533 | Add **bold** support in chat mode | 2023-01-30 08:36:58 -03:00
oobabooga | 2dadf42cb5 | Print the tokenized example dialogue in a prettier way | 2023-01-30 08:29:49 -03:00
oobabooga | 161cae001b | I needed this | 2023-01-29 23:20:22 -03:00
oobabooga | 3ebca480f6 | Minor fix | 2023-01-29 23:05:17 -03:00
oobabooga | 00707a0b3b | Add "Impersonate" button | 2023-01-29 22:56:23 -03:00
oobabooga | ad148571f4 | Add fixed Google Translate extension | 2023-01-29 14:55:24 -03:00
oobabooga | b19415849c | Merge branch 'main' of github.com:oobabooga/text-generation-webui | 2023-01-29 14:41:41 -03:00
oobabooga | 584a7dd50d | Remove defective extension | 2023-01-29 14:40:14 -03:00
oobabooga | f92996b3c8 | Update README.md | 2023-01-29 14:37:05 -03:00
oobabooga | de72e83508 | Reorganize things | 2023-01-29 14:27:22 -03:00
oobabooga | 6fbfee9e6d | Remove some bloat | 2023-01-29 12:05:18 -03:00
oobabooga | 9c9bd1074f | Add option to replace the bot's last reply | 2023-01-29 12:02:44 -03:00
oobabooga | e5ff4ddfc8 | Add bot prefix modifier option in extensions | 2023-01-29 10:11:59 -03:00
oobabooga | c1c129196e | Merge branch 'main' of github.com:oobabooga/text-generation-webui | 2023-01-29 09:48:44 -03:00
oobabooga | b6d01bb704 | Enable extensions in all modes, not just chat | 2023-01-29 09:48:18 -03:00
oobabooga | ba33e6f1d3 | Delete .github/ISSUE_TEMPLATE directory | 2023-01-29 09:37:05 -03:00
oobabooga | af09b0ed9a | Update issue templates | 2023-01-29 09:36:11 -03:00
oobabooga | 9e4db10cd0 | Update README.md | 2023-01-29 03:13:22 -03:00
oobabooga | 1a139664f5 | Grammar | 2023-01-29 02:54:36 -03:00
oobabooga | 2d134031ca | Apply extensions to character greeting | 2023-01-29 00:04:11 -03:00
oobabooga | c9447da898 | Rename extension | 2023-01-28 23:47:15 -03:00
oobabooga | 1a8d815de4 | Make src and dst languages explicit | 2023-01-28 23:29:17 -03:00
oobabooga | 7ff68ef252 | Add Google Translate extension | 2023-01-28 23:26:07 -03:00
oobabooga | e349b52256 | Read extensions parameters from settings file | 2023-01-28 23:21:40 -03:00
oobabooga | 2239be2351 | Support for number/bool extension parameters | 2023-01-28 23:08:28 -03:00
oobabooga | 6da94e358c | Add support for extensions parameters (still experimental) | 2023-01-28 23:00:51 -03:00
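The extensions commits above (6da94e358c, 2239be2351 and e349b52256, together with the bot prefix modifier in e5ff4ddfc8) suggest an interface where each extension exposes a params dict of strings, numbers and booleans that the server can read and override from the settings file, plus optional hook functions applied to the text. A rough sketch of such an extension script; the path, hook names and parameter keys below are illustrative rather than copied from the repository:

```python
# extensions/example/script.py  (hypothetical path and hook names)

# Parameters the web UI and the settings file can read and override.
params = {
    "activate": True,
    "suffix": " *smiles*",
    "max_words": 0,
}

def input_modifier(string):
    """Applied to the user's input before it is sent to the model."""
    return string

def output_modifier(string):
    """Applied to the model's reply before it is displayed."""
    return string

def bot_prefix_modifier(string):
    """Applied to the prefix that starts the bot's turn, e.g. "Bot:"."""
    if params["activate"]:
        return string + params["suffix"]
    return string
```

Reading parameters from the settings file (e349b52256) would then amount to updating this dict with any matching keys, along the lines of params.update(settings.get("example-params", {})), where the settings key name is again an assumption.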
oobabooga | e779fd795f | Save TavernAI characters with TavernAI- prefix | 2023-01-28 21:01:56 -03:00
oobabooga | 833a1138fa | Explain the dialogue tokenization output | 2023-01-28 20:41:02 -03:00
oobabooga | 89c862b179 | Update README | 2023-01-28 20:37:43 -03:00
oobabooga | 545b7395b2 | Prevent huge --help outputs | 2023-01-28 20:36:51 -03:00
oobabooga | f4c455ce29 | Merge pull request #30 from 10sa/patch-1: Add listening port options for listening mode | 2023-01-28 20:35:20 -03:00