| Author | Commit | Message | Date |
|---|---|---|---|
| oobabooga | 5cd6dd4287 | Fix no-mmap bug | 2023-05-16 17:35:49 -03:00 |
| oobabooga | 89e37626ab | Reorganize chat settings tab | 2023-05-16 17:22:59 -03:00 |
| Forkoz | d205ec9706 | Fix Training fails when evaluation dataset is selected (#2099). Fixes https://github.com/oobabooga/text-generation-webui/issues/2078 from Googulator | 2023-05-16 13:40:19 -03:00 |
| Orbitoid | 428261eede | fix: elevenlabs removed the need for the api key for refreshing voices (#2097) | 2023-05-16 13:34:49 -03:00 |
| oobabooga | cd9be4c2ba | Update llama.cpp-models.md | 2023-05-16 00:49:32 -03:00 |
| atriantafy | 26cf8c2545 | add api port options (#1990) | 2023-05-15 20:44:16 -03:00 |
| Andrei | e657dd342d | Add in-memory cache support for llama.cpp (#1936) | 2023-05-15 20:19:55 -03:00 |
| Jakub Strnad | 0227e738ed | Add settings UI for llama.cpp and fixed reloading of llama.cpp models (#2087) | 2023-05-15 19:51:23 -03:00 |
| oobabooga | 10869de0f4 | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-05-15 19:39:48 -03:00 |
| oobabooga | c07215cc08 | Improve the default Assistant character | 2023-05-15 19:39:08 -03:00 |
| oobabooga | 4e66f68115 | Create get_max_memory_dict() function | 2023-05-15 19:38:27 -03:00 |
| dependabot[bot] | ae54d83455 | Bump transformers from 4.28.1 to 4.29.1 (#2089) | 2023-05-15 19:25:24 -03:00 |
| AlphaAtlas | 071f0776ad | Add llama.cpp GPU offload option (#2060) | 2023-05-14 22:58:11 -03:00 |
| feeelX | eee986348c | Update llama-cpp-python from 0.1.45 to 0.1.50 (#2058) | 2023-05-14 22:41:14 -03:00 |
| oobabooga | 897fa60069 | Sort selected superbooga chunks by insertion order, for better coherence | 2023-05-14 22:19:29 -03:00 |
| Luis Lopez | b07f849e41 | Add superbooga chunk separator option (#2051) | 2023-05-14 21:44:52 -03:00 |
| matatonic | ab08cf6465 | [extensions/openai] clip extra leading space (#2042) | 2023-05-14 12:57:52 -03:00 |
| oobabooga | 3b886f9c9f | Add chat-instruct mode (#2049) | 2023-05-14 10:43:55 -03:00 |
| oobabooga | 5f6cf39f36 | Change the injection context string | 2023-05-13 14:23:02 -03:00 |
| oobabooga | 7cc17e3f1f | Refactor superbooga | 2023-05-13 14:15:40 -03:00 |
| oobabooga | 826c74c201 | Expand superbooga to instruct mode and change the chat implementation | 2023-05-13 12:50:19 -03:00 |
| oobabooga | c746a5bd00 | Add .rstrip(' ') to openai api | 2023-05-12 14:40:48 -03:00 |
| Damian Stewart | 3f1bfba718 | Clarify how to start server.py with multimodal API support (#2025) | 2023-05-12 14:37:49 -03:00 |
| oobabooga | 437d1c7ead | Fix bug in save_model_settings | 2023-05-12 14:33:00 -03:00 |
| oobabooga | 146a9cb393 | Allow superbooga to download URLs in parallel | 2023-05-12 14:19:55 -03:00 |
| oobabooga | df37ba5256 | Update impersonate_wrapper | 2023-05-12 12:59:48 -03:00 |
| oobabooga | e283ddc559 | Change how spaces are handled in continue/generation attempts | 2023-05-12 12:50:29 -03:00 |
| oobabooga | 2eeb27659d | Fix bug in --cpu-memory | 2023-05-12 06:17:07 -03:00 |
| oobabooga | fcb46282c5 | Add a rule to config.yaml | 2023-05-12 06:11:58 -03:00 |
| oobabooga | 5eaa914e1b | Fix settings.json being ignored because of config.yaml | 2023-05-12 06:09:45 -03:00 |
| oobabooga | a77965e801 | Make the regex for "Save settings for this model" exact | 2023-05-12 00:43:13 -03:00 |
| matatonic | f98fd01dcd | is_chat=False for /edits (#2011) | 2023-05-11 19:15:11 -03:00 |
| oobabooga | 71693161eb | Better handle spaces in LlamaTokenizer | 2023-05-11 17:55:50 -03:00 |
| oobabooga | 7221d1389a | Fix a bug | 2023-05-11 17:11:10 -03:00 |
| oobabooga | 0d36c18f5d | Always return only the new tokens in generation functions | 2023-05-11 17:07:20 -03:00 |
| matatonic | c4f0e6d740 | is_chat changes fix for openai extension (#2008) | 2023-05-11 16:32:25 -03:00 |
| oobabooga | 394bb253db | Syntax improvement | 2023-05-11 16:27:50 -03:00 |
| oobabooga | f7dbddfff5 | Add a variable for tts extensions to use | 2023-05-11 16:12:46 -03:00 |
| oobabooga | 638c6a65a2 | Refactor chat functions (#2003) | 2023-05-11 15:37:04 -03:00 |
| real | 4e9da22c58 | missing stream api port added to docker compose (#2005) | 2023-05-11 15:07:56 -03:00 |
| matatonic | 309b72e549 | [extension/openai] add edits & image endpoints & fix prompt return in non --chat modes (#1935) | 2023-05-11 11:06:39 -03:00 |
| oobabooga | 23d3f6909a | Update README.md | 2023-05-11 10:21:20 -03:00 |
| oobabooga | 400f3648f4 | Update docs/README.md | 2023-05-11 10:10:24 -03:00 |
| oobabooga | 2930e5a895 | Update README.md | 2023-05-11 10:04:38 -03:00 |
| oobabooga | 0ff38c994e | Update README.md | 2023-05-11 09:58:58 -03:00 |
| oobabooga | e6959a5d9a | Update README.md | 2023-05-11 09:54:22 -03:00 |
| oobabooga | dcfd09b61e | Update README.md | 2023-05-11 09:49:57 -03:00 |
| oobabooga | ac9a86a16c | Update llama.cpp-models.md | 2023-05-11 09:47:36 -03:00 |
| oobabooga | 7a49ceab29 | Update README.md | 2023-05-11 09:42:39 -03:00 |
| oobabooga | 943b5e5f80 | Minor bug fix | 2023-05-10 23:54:25 -03:00 |