| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| oobabooga | 4e66f68115 | Create get_max_memory_dict() function | 2023-05-15 19:38:27 -03:00 |
| dependabot[bot] | ae54d83455 | Bump transformers from 4.28.1 to 4.29.1 (#2089) | 2023-05-15 19:25:24 -03:00 |
| AlphaAtlas | 071f0776ad | Add llama.cpp GPU offload option (#2060) | 2023-05-14 22:58:11 -03:00 |
| feeelX | eee986348c | Update llama-cpp-python from 0.1.45 to 0.1.50 (#2058) | 2023-05-14 22:41:14 -03:00 |
| oobabooga | 897fa60069 | Sort selected superbooga chunks by insertion order (for better coherence) | 2023-05-14 22:19:29 -03:00 |
| Luis Lopez | b07f849e41 | Add superbooga chunk separator option (#2051) | 2023-05-14 21:44:52 -03:00 |
| matatonic | ab08cf6465 | [extensions/openai] clip extra leading space (#2042) | 2023-05-14 12:57:52 -03:00 |
| oobabooga | 3b886f9c9f | Add chat-instruct mode (#2049) | 2023-05-14 10:43:55 -03:00 |
| oobabooga | 5f6cf39f36 | Change the injection context string | 2023-05-13 14:23:02 -03:00 |
| oobabooga | 7cc17e3f1f | Refactor superbooga | 2023-05-13 14:15:40 -03:00 |
| oobabooga | 826c74c201 | Expand superbooga to instruct mode and change the chat implementation | 2023-05-13 12:50:19 -03:00 |
| oobabooga | c746a5bd00 | Add .rstrip(' ') to openai api | 2023-05-12 14:40:48 -03:00 |
| Damian Stewart | 3f1bfba718 | Clarify how to start server.py with multimodal API support (#2025) | 2023-05-12 14:37:49 -03:00 |
| oobabooga | 437d1c7ead | Fix bug in save_model_settings | 2023-05-12 14:33:00 -03:00 |
| oobabooga | 146a9cb393 | Allow superbooga to download URLs in parallel | 2023-05-12 14:19:55 -03:00 |
| oobabooga | df37ba5256 | Update impersonate_wrapper | 2023-05-12 12:59:48 -03:00 |
| oobabooga | e283ddc559 | Change how spaces are handled in continue/generation attempts | 2023-05-12 12:50:29 -03:00 |
| oobabooga | 2eeb27659d | Fix bug in --cpu-memory | 2023-05-12 06:17:07 -03:00 |
| oobabooga | fcb46282c5 | Add a rule to config.yaml | 2023-05-12 06:11:58 -03:00 |
| oobabooga | 5eaa914e1b | Fix settings.json being ignored because of config.yaml | 2023-05-12 06:09:45 -03:00 |
| oobabooga | a77965e801 | Make the regex for "Save settings for this model" exact | 2023-05-12 00:43:13 -03:00 |
| matatonic | f98fd01dcd | is_chat=False for /edits (#2011) | 2023-05-11 19:15:11 -03:00 |
| oobabooga | 71693161eb | Better handle spaces in LlamaTokenizer | 2023-05-11 17:55:50 -03:00 |
| oobabooga | 7221d1389a | Fix a bug | 2023-05-11 17:11:10 -03:00 |
| oobabooga | 0d36c18f5d | Always return only the new tokens in generation functions | 2023-05-11 17:07:20 -03:00 |
| matatonic | c4f0e6d740 | is_chat changes fix for openai extension (#2008) | 2023-05-11 16:32:25 -03:00 |
| oobabooga | 394bb253db | Syntax improvement | 2023-05-11 16:27:50 -03:00 |
| oobabooga | f7dbddfff5 | Add a variable for tts extensions to use | 2023-05-11 16:12:46 -03:00 |
| oobabooga | 638c6a65a2 | Refactor chat functions (#2003) | 2023-05-11 15:37:04 -03:00 |
| real | 4e9da22c58 | missing stream api port added to docker compose (#2005) | 2023-05-11 15:07:56 -03:00 |
| matatonic | 309b72e549 | [extension/openai] add edits & image endpoints & fix prompt return in non --chat modes (#1935) | 2023-05-11 11:06:39 -03:00 |
| oobabooga | 23d3f6909a | Update README.md | 2023-05-11 10:21:20 -03:00 |
| oobabooga | 400f3648f4 | Update docs/README.md | 2023-05-11 10:10:24 -03:00 |
| oobabooga | 2930e5a895 | Update README.md | 2023-05-11 10:04:38 -03:00 |
| oobabooga | 0ff38c994e | Update README.md | 2023-05-11 09:58:58 -03:00 |
| oobabooga | e6959a5d9a | Update README.md | 2023-05-11 09:54:22 -03:00 |
| oobabooga | dcfd09b61e | Update README.md | 2023-05-11 09:49:57 -03:00 |
| oobabooga | ac9a86a16c | Update llama.cpp-models.md | 2023-05-11 09:47:36 -03:00 |
| oobabooga | 7a49ceab29 | Update README.md | 2023-05-11 09:42:39 -03:00 |
| oobabooga | 943b5e5f80 | Minor bug fix | 2023-05-10 23:54:25 -03:00 |
| oobabooga | 9695bfe117 | Add an option for cleaning up html in superbooga | 2023-05-10 23:51:52 -03:00 |
| oobabooga | c7ba2d4f3f | Change a message in download-model.py | 2023-05-10 19:00:14 -03:00 |
| oobabooga | 1309cdd257 | Add a space | 2023-05-10 18:03:12 -03:00 |
| oobabooga | 3e19733d35 | Remove obsolete comment | 2023-05-10 18:01:04 -03:00 |
| oobabooga | 4ab5deeea0 | Update INSTRUCTIONS.TXT | 2023-05-10 18:00:37 -03:00 |
| oobabooga | d7d3f7f31c | Add a "CMD_FLAGS" variable | 2023-05-10 17:54:12 -03:00 |
| oobabooga | b7a589afc8 | Improve the Metharme prompt | 2023-05-10 16:09:32 -03:00 |
| oobabooga | e5b1547849 | Fix reload model button | 2023-05-10 14:44:25 -03:00 |
| oobabooga | b01c4884cb | Better stopping strings for instruct mode | 2023-05-10 14:22:38 -03:00 |
| oobabooga | 6a4783afc7 | Add markdown table rendering | 2023-05-10 13:41:23 -03:00 |