Commit Graph

419 Commits

Author          SHA1        Message  Date
oobabooga       1622059179  Move BLIP to the CPU (It's just as fast)  2023-02-15 00:03:19 -03:00
oobabooga       d4d90a8000  Merge pull request #76 from SillyLossy/main (Use BLIP to send a picture to model)  2023-02-14 23:57:44 -03:00
oobabooga       8c3ef58e00  Use BLIP directly + some simplifications  2023-02-14 23:55:46 -03:00
SillyLossy      a7d98f494a  Use BLIP to send a picture to model  2023-02-15 01:38:21 +02:00
oobabooga       79d3a524f2  Add a file  2023-02-14 15:18:05 -03:00
oobabooga       f6bf74dcd5  Add Silero TTS extension  2023-02-14 15:06:06 -03:00
oobabooga       01e5772302  Update README.md  2023-02-14 13:06:26 -03:00
oobabooga       d910d435cd  Consider the softprompt in the maximum prompt length calculation  2023-02-14 12:06:47 -03:00
oobabooga       8b3bb512ef  Minor bug fix (soft prompt was being loaded twice)  2023-02-13 23:34:04 -03:00
oobabooga       56bbc996a4  Minor CSS change for readability  2023-02-13 23:01:14 -03:00
oobabooga       210c918199  Update README.md  2023-02-13 21:49:19 -03:00
oobabooga       2fe9d7f372  Merge branch 'main' of github.com:oobabooga/text-generation-webui  2023-02-13 18:48:46 -03:00
oobabooga       7739a29524  Some simplifications  2023-02-13 18:48:32 -03:00
oobabooga       b7ddcab53a  Update README.md  2023-02-13 15:52:49 -03:00
oobabooga       3277b751f5  Add softprompt support (for real this time) (Is this too much voodoo for our purposes?)  2023-02-13 15:25:16 -03:00
oobabooga       aa1177ff15  Send last internal reply to input rather than visible  2023-02-13 03:29:23 -03:00
oobabooga       61aed97439  Slightly increase a margin  2023-02-12 17:38:54 -03:00
oobabooga       2c3abcf57a  Add support for rosey/chip/joi instruct models  2023-02-12 09:46:34 -03:00
oobabooga       7ef7bba6e6  Add progress bar for model loading  2023-02-12 09:36:27 -03:00
oobabooga       939e9d00a2  Update README.md  2023-02-12 00:47:03 -03:00
oobabooga       bf9dd8f8ee  Add --text-only option to the download script  2023-02-12 00:42:56 -03:00
oobabooga       42cc307409  Update README.md  2023-02-12 00:34:55 -03:00
oobabooga       66862203fc  Only download safetensors if both pytorch and safetensors are present  2023-02-12 00:06:22 -03:00
oobabooga       5d3f15b915  Use the CPU if no GPU is detected  2023-02-11 23:17:06 -03:00
oobabooga       337290777b  Rename example extension to "softprompt"  2023-02-11 17:17:10 -03:00
oobabooga       b3c4657c47  Remove commas from preset files  2023-02-11 14:54:29 -03:00
oobabooga       144857acfe  Update README  2023-02-11 14:49:11 -03:00
oobabooga       0dd1409f24  Add penalty_alpha parameter (contrastive search)  2023-02-11 14:48:12 -03:00
oobabooga       8aafb55693  1-click installer now also works for AMD GPUs (I think)  2023-02-11 14:24:47 -03:00
oobabooga       7eed553337  Merge branch 'main' of github.com:oobabooga/text-generation-webui  2023-02-11 08:00:29 -03:00
oobabooga       2ed0386d87  Fix replace last reply in --chat mode (for #69)  2023-02-11 07:59:54 -03:00
oobabooga       1e97cb9570  Merge pull request #68 from Spencer-Dawson/patch-1 (Added ROCm install instructions to README)  2023-02-11 07:56:30 -03:00
oobabooga       1176d64b13  Update README.md  2023-02-11 07:56:12 -03:00
Spencer-Dawson  c5324d653b  Re-added missed README changes  2023-02-11 00:13:06 -07:00
oobabooga       cf89ef1c74  Update README.md  2023-02-10 21:46:29 -03:00
oobabooga       8782ac1911  Update README.md  2023-02-10 17:10:27 -03:00
oobabooga       7d7cc37560  Add Linux 1-click installer  2023-02-10 17:09:53 -03:00
oobabooga       316e07f06a  Auto-assign GPU memory with --auto-devices alone  2023-02-10 16:36:06 -03:00
oobabooga       76d3d7ddb3  Reorder the imports here too  2023-02-10 15:57:55 -03:00
oobabooga       219366342b  Sort imports according to PEP8 (based on #67)  2023-02-10 15:40:03 -03:00
oobabooga       96d56d4f3c  Turn the example script into a soft prompt script  2023-02-10 15:24:26 -03:00
oobabooga       e0b164feab  Merge pull request #66 from 81300/bf16 (Extend bfloat16 support)  2023-02-09 15:13:17 -03:00
81300           20dbef9623  Extend bfloat16 support  2023-02-09 20:00:03 +02:00
oobabooga       991de5ed40  Update README.md  2023-02-09 14:36:47 -03:00
oobabooga       04d3d0aee6  Add 1-click Windows installer (for #45)  2023-02-09 13:27:30 -03:00
oobabooga       a21620fc59  Update README  2023-02-08 01:17:50 -03:00
oobabooga       cadd100405  min_length has to be 0 when streaming is on  2023-02-08 00:23:35 -03:00
oobabooga       6be571cff7  Better variable names  2023-02-08 00:19:20 -03:00
oobabooga       fc0493d885  Add credits  2023-02-08 00:09:41 -03:00
oobabooga       58b07cca81  length_penalty can be negative (apparently)  2023-02-07 23:33:02 -03:00