oobabooga | ee95e55df6 | Fix RWKV tokenizer | 2023-03-27 23:42:29 -03:00
oobabooga | 036163a751 | Change description | 2023-03-27 23:39:26 -03:00
oobabooga | 30585b3e71 | Update README | 2023-03-27 23:35:01 -03:00
oobabooga | 005f552ea3 | Some simplifications | 2023-03-27 23:29:52 -03:00
oobabooga | fde92048af | Merge branch 'main' into catalpaaa-lora-and-model-dir | 2023-03-27 23:16:44 -03:00
oobabooga | 641e1a09a7 | Don't flash when selecting a new prompt | 2023-03-27 14:48:43 -03:00
oobabooga | 48a6c9513e | Merge pull request #572 from clusterfudge/issues/571 | 2023-03-27 14:06:38 -03:00
  Potential fix for issues/571
oobabooga | 268abd1cba | Add some space in notebook mode | 2023-03-27 13:52:12 -03:00
oobabooga | af65c12900 | Change Stop button behavior | 2023-03-27 13:23:59 -03:00
oobabooga | addb9777f9 | Increase size of GALACTICA equations | 2023-03-27 12:59:07 -03:00
oobabooga | 572bafcd24 | Less verbose message | 2023-03-27 12:43:37 -03:00
oobabooga | 202e981d00 | Make Generate/Stop buttons smaller in notebook mode | 2023-03-27 12:30:57 -03:00
oobabooga | 8e2d94a5a1 | Add saved prompts to gitignore | 2023-03-27 12:21:19 -03:00
oobabooga | 57345b8f30 | Add prompt loading/saving menus + reorganize interface | 2023-03-27 12:16:37 -03:00
oobabooga | 3dc61284d5 | Handle unloading LoRA from dropdown menu icon | 2023-03-27 00:04:43 -03:00
oobabooga | b6e38e8b97 | silero_tts streaming fix (#568 from Brawlence/silero_tts-fix) | 2023-03-26 23:59:07 -03:00
oobabooga | af603a142a | Unload models on request (#471 from Brawlence/main) | 2023-03-26 23:53:39 -03:00
oobabooga | 95c97e1747 | Unload the model using the "Remove all" button | 2023-03-26 23:47:29 -03:00
oobabooga | e07c9e3093 | Merge branch 'main' into Brawlence-main | 2023-03-26 23:40:51 -03:00
oobabooga | 511be06dcc | Merge branch 'main' of github.com:oobabooga/text-generation-webui | 2023-03-26 22:21:29 -03:00
oobabooga | 1c77fdca4c | Change notebook mode appearance | 2023-03-26 22:20:30 -03:00
oobabooga | 9ff6a538b6 | Bump gradio version | 2023-03-26 22:11:19 -03:00
  Make sure to upgrade with `pip install -r requirements.txt --upgrade`
oobabooga | a04b7cf264 | Merge pull request #585 from fkusche/also-download-markdown | 2023-03-26 14:51:23 -03:00
  Also download Markdown files
Florian Kusche | 19174842b8 | Also download Markdown files | 2023-03-26 19:41:14 +02:00
oobabooga | 8222d32240 | Merge pull request #565 from mcmonkey4eva/improve-gitignore | 2023-03-26 13:31:45 -03:00
  improve/simplify gitignore
oobabooga | 49c10c5570 | Add support for the latest GPTQ models with group-size (#530) | 2023-03-26 00:11:33 -03:00
  **Warning: old 4-bit weights will not work anymore!**
  See here how to get up-to-date weights: https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#step-2-get-the-pre-converted-weights
Sean Fitzgerald | 0bac80d9eb | Potential fix for issues/571 | 2023-03-25 13:08:45 -07:00
Alex "mcmonkey" Goodwin | 8134c4b334 | add training/datasets to gitignore for #570 | 2023-03-25 12:41:18 -07:00
Φφ | 1a1e420e65 | Silero_tts streaming fix | 2023-03-25 21:33:30 +03:00
  Temporarily suppress the streaming during the audio response as it would interfere with the audio (making it stutter and play anew)
Alex "mcmonkey" Goodwin | 9ccf505ccd | improve/simplify gitignore | 2023-03-25 10:04:00 -07:00
  - add repositories
  - remove the redundant "/*" on folders
  - remove the exclusions for files that already exist
oobabooga | 8c8e8b4450 | Fix the early stopping callback #559 | 2023-03-25 12:35:52 -03:00
oobabooga | a1f12d607f | Merge pull request #538 from Ph0rk0z/display-input-context | 2023-03-25 11:56:18 -03:00
  Add display of context when input was generated
catalpaaa | f740ee558c | Merge branch 'oobabooga:main' into lora-and-model-dir | 2023-03-25 01:28:33 -07:00
oobabooga | 70f9565f37 | Update README.md | 2023-03-25 02:35:30 -03:00
oobabooga | 25be9698c7 | Fix LoRA on mps | 2023-03-25 01:18:32 -03:00
oobabooga | 3da633a497 | Merge pull request #529 from EyeDeck/main | 2023-03-24 23:51:01 -03:00
  Allow loading of .safetensors through GPTQ-for-LLaMa
catalpaaa | d51cb8292b | Update server.py | 2023-03-24 17:36:31 -07:00
  yea i should go to bed
catalpaaa | 9e2963e0c8 | Update server.py | 2023-03-24 17:35:45 -07:00
catalpaaa | ec2a1facee | Update server.py | 2023-03-24 17:34:33 -07:00
catalpaaa | b37c54edcf | lora-dir, model-dir and login auth | 2023-03-24 17:30:18 -07:00
  Added lora-dir, model-dir, and a login auth argument that points to a file containing usernames and passwords in the format "u:pw,u:pw,..."
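For illustration only: a credentials file in the "u:pw,u:pw,..." format described in the commit above might look like the following (the usernames and passwords here are made-up placeholders, not part of the commit).

```
admin:hunter2,guest:letmein
```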
oobabooga | 9fa47c0eed | Revert GPTQ_loader.py (accident) | 2023-03-24 19:57:12 -03:00
oobabooga | a6bf54739c | Revert models.py (accident) | 2023-03-24 19:56:45 -03:00
oobabooga | 0a16224451 | Update GPTQ_loader.py | 2023-03-24 19:54:36 -03:00
oobabooga | a80aa65986 | Update models.py | 2023-03-24 19:53:20 -03:00
oobabooga | 507db0929d | Do not use empty user messages in chat mode | 2023-03-24 17:22:22 -03:00
  This allows the bot to send messages by clicking on Generate with empty inputs.
oobabooga | 6e1b16c2aa | Update html_generator.py | 2023-03-24 17:18:27 -03:00
oobabooga | ffb0187e83 | Update chat.py | 2023-03-24 17:17:29 -03:00
oobabooga | c14e598f14 | Merge pull request #433 from mayaeary/fix/api-reload | 2023-03-24 16:56:10 -03:00
  Fix api extension duplicating
oobabooga | bfe960731f | Merge branch 'main' into fix/api-reload | 2023-03-24 16:54:41 -03:00
oobabooga | 4a724ed22f | Reorder imports | 2023-03-24 16:53:56 -03:00