oobabooga
827ae51f72
Sort the imports
2023-03-07 00:23:36 -03:00
oobabooga
b4bfd87319
Update README.md
2023-03-06 20:55:01 -03:00
oobabooga
8f4a197c05
Add credits
2023-03-06 20:34:36 -03:00
oobabooga
d0e8780555
Update README.md
2023-03-06 20:17:59 -03:00
oobabooga
18eaeebbfe
Merge branch 'main' of github.com:oobabooga/text-generation-webui
2023-03-06 20:16:03 -03:00
oobabooga
18ccfcd7fe
Update README.md
2023-03-06 20:15:55 -03:00
oobabooga
153dfeb4dd
Add --rwkv-cuda-on parameter, bump rwkv version
2023-03-06 20:12:54 -03:00
oobabooga
4143d4aa48
Merge pull request #175 from SagsMug/main
Add api example using websockets
2023-03-06 19:53:48 -03:00
oobabooga
8b882c132a
tabs -> spaces
2023-03-06 19:52:26 -03:00
oobabooga
eebec65075
Improve readability
2023-03-06 19:46:46 -03:00
oobabooga
99f69dfcaa
Merge pull request #169 from MetaIX/patch-1
Support for Eleven Labs TTS
2023-03-06 19:40:31 -03:00
oobabooga
944fdc03b2
Rename the folder
2023-03-06 19:38:36 -03:00
oobabooga
49ae183ac9
Move new extension to a separate file
2023-03-06 19:28:53 -03:00
oobabooga
91823e1ed1
Update README.md
2023-03-06 16:48:31 -03:00
oobabooga
6904a507c6
Change some parameters
2023-03-06 16:29:43 -03:00
oobabooga
20bd645f6a
Fix bug in multigpu setups (attempt 3)
2023-03-06 15:58:18 -03:00
oobabooga
09a7c36e1b
Minor improvement while running custom models
2023-03-06 15:36:35 -03:00
oobabooga
24c4c20391
Fix bug in multigpu setups (attempt #2)
2023-03-06 15:23:29 -03:00
oobabooga
d88b7836c6
Fix bug in multigpu setups
2023-03-06 14:58:30 -03:00
oobabooga
5bed607b77
Increase repetition frequency/penalty for RWKV
2023-03-06 14:25:48 -03:00
oobabooga
aa7ce0665e
Merge branch 'main' of github.com:oobabooga/text-generation-webui
2023-03-06 10:58:41 -03:00
oobabooga
bf56b6c1fb
Load settings.json without the need for --settings settings.json
This is for setting UI defaults
2023-03-06 10:57:45 -03:00
oobabooga
2de9f122cd
Update README.md
2023-03-06 09:34:49 -03:00
oobabooga
e91f4bc25a
Add RWKV tokenizer
2023-03-06 08:45:49 -03:00
Mug
53ce21ac68
Add api example using websockets
2023-03-06 12:13:50 +01:00
MetaIX
9907bee4a4
Support for Eleven Labs TTS
As per your suggestion at https://github.com/oobabooga/text-generation-webui/issues/159, here's my attempt.
I'm brand new to Python and GitHub, and it's completely different from Unreal and visual coding, so forgive my amateurish code. This essentially adds support for Eleven Labs TTS. I tested it without major issues, and I believe it's functional (hopefully).
Extra requirements: elevenlabslib (https://github.com/lugia19/elevenlabslib), sounddevice 0.4.6, and soundfile.
The folder structure is the same as the Silero TTS extension.
2023-03-05 19:04:22 -06:00
oobabooga
c855b828fe
Better handle <USER>
2023-03-05 17:01:47 -03:00
oobabooga
145c725c39
Bump RWKV version
2023-03-05 16:28:21 -03:00
oobabooga
2af66a4d4c
Fix <USER> in pygmalion replies
2023-03-05 16:08:50 -03:00
oobabooga
a54b91af77
Improve readability
2023-03-05 10:21:15 -03:00
oobabooga
8e706df20e
Fix a memory leak when text streaming is on
2023-03-05 10:12:43 -03:00
oobabooga
5492e2e9f8
Add sentencepiece
2023-03-05 10:02:24 -03:00
oobabooga
90206204aa
Merge pull request #163 from oobabooga/hf_llama
Move towards HF LLaMA implementation
2023-03-05 01:55:43 -03:00
oobabooga
c33715ad5b
Move towards HF LLaMA implementation
2023-03-05 01:20:31 -03:00
oobabooga
bd8aac8fa4
Add LLaMA 8-bit support
2023-03-04 13:28:42 -03:00
oobabooga
c93f1fa99b
Count the tokens more conservatively
2023-03-04 03:10:21 -03:00
oobabooga
736f61610b
Update README
2023-03-04 01:33:52 -03:00
oobabooga
ed8b35efd2
Add --pin-weight parameter for FlexGen
2023-03-04 01:04:02 -03:00
oobabooga
05e703b4a4
Print the performance information more reliably
2023-03-03 21:24:32 -03:00
oobabooga
5a79863df3
Increase the sequence length, decrease batch size
I have no idea what I am doing
2023-03-03 15:54:13 -03:00
oobabooga
e62b9b1074
Revamp the "Default" preset with HF defaults
2023-03-03 15:26:08 -03:00
oobabooga
a345a2acd2
Add a tokenizer placeholder
2023-03-03 15:16:55 -03:00
oobabooga
4cc36dc434
Tweak the Naive preset (for LLaMA/RWKV)
2023-03-03 15:09:00 -03:00
oobabooga
5b354817f6
Make chat minimally work with LLaMA
2023-03-03 15:04:41 -03:00
oobabooga
ea5c5eb3da
Add LLaMA support
2023-03-03 14:39:14 -03:00
oobabooga
2bff646130
Stop chat from flashing dark when processing
2023-03-03 13:19:13 -03:00
oobabooga
7c70e0e2a6
Fix the download script (sort of)
2023-03-02 14:05:21 -03:00
oobabooga
bcea196c9d
Bump flexgen version
2023-03-02 12:03:57 -03:00
oobabooga
76378c6cc2
Update README
2023-03-02 11:27:15 -03:00
oobabooga
169209805d
Model-aware prompts and presets
2023-03-02 11:25:04 -03:00