Commit Graph

1762 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| oobabooga | 5f6cf39f36 | Change the injection context string | 2023-05-13 14:23:02 -03:00 |
| oobabooga | 7cc17e3f1f | Refactor superbooga | 2023-05-13 14:15:40 -03:00 |
| oobabooga | 826c74c201 | Expand superbooga to instruct mode and change the chat implementation | 2023-05-13 12:50:19 -03:00 |
| oobabooga | c746a5bd00 | Add .rstrip(' ') to openai api | 2023-05-12 14:40:48 -03:00 |
| Damian Stewart | 3f1bfba718 | Clarify how to start server.py with multimodal API support (#2025) | 2023-05-12 14:37:49 -03:00 |
| oobabooga | 437d1c7ead | Fix bug in save_model_settings | 2023-05-12 14:33:00 -03:00 |
| oobabooga | 146a9cb393 | Allow superbooga to download URLs in parallel | 2023-05-12 14:19:55 -03:00 |
| oobabooga | df37ba5256 | Update impersonate_wrapper | 2023-05-12 12:59:48 -03:00 |
| oobabooga | e283ddc559 | Change how spaces are handled in continue/generation attempts | 2023-05-12 12:50:29 -03:00 |
| oobabooga | 2eeb27659d | Fix bug in --cpu-memory | 2023-05-12 06:17:07 -03:00 |
| oobabooga | fcb46282c5 | Add a rule to config.yaml | 2023-05-12 06:11:58 -03:00 |
| oobabooga | 5eaa914e1b | Fix settings.json being ignored because of config.yaml | 2023-05-12 06:09:45 -03:00 |
| oobabooga | a77965e801 | Make the regex for "Save settings for this model" exact | 2023-05-12 00:43:13 -03:00 |
| matatonic | f98fd01dcd | is_chat=False for /edits (#2011) | 2023-05-11 19:15:11 -03:00 |
| oobabooga | 71693161eb | Better handle spaces in LlamaTokenizer | 2023-05-11 17:55:50 -03:00 |
| oobabooga | 7221d1389a | Fix a bug | 2023-05-11 17:11:10 -03:00 |
| oobabooga | 0d36c18f5d | Always return only the new tokens in generation functions | 2023-05-11 17:07:20 -03:00 |
| matatonic | c4f0e6d740 | is_chat changes fix for openai extension (#2008) | 2023-05-11 16:32:25 -03:00 |
| oobabooga | 394bb253db | Syntax improvement | 2023-05-11 16:27:50 -03:00 |
| oobabooga | f7dbddfff5 | Add a variable for tts extensions to use | 2023-05-11 16:12:46 -03:00 |
| oobabooga | 638c6a65a2 | Refactor chat functions (#2003) | 2023-05-11 15:37:04 -03:00 |
| real | 4e9da22c58 | missing stream api port added to docker compose (#2005) | 2023-05-11 15:07:56 -03:00 |
| matatonic | 309b72e549 | [extension/openai] add edits & image endpoints & fix prompt return in non --chat modes (#1935) | 2023-05-11 11:06:39 -03:00 |
| oobabooga | 23d3f6909a | Update README.md | 2023-05-11 10:21:20 -03:00 |
| oobabooga | 400f3648f4 | Update docs/README.md | 2023-05-11 10:10:24 -03:00 |
| oobabooga | 2930e5a895 | Update README.md | 2023-05-11 10:04:38 -03:00 |
| oobabooga | 0ff38c994e | Update README.md | 2023-05-11 09:58:58 -03:00 |
| oobabooga | e6959a5d9a | Update README.md | 2023-05-11 09:54:22 -03:00 |
| oobabooga | dcfd09b61e | Update README.md | 2023-05-11 09:49:57 -03:00 |
| oobabooga | ac9a86a16c | Update llama.cpp-models.md | 2023-05-11 09:47:36 -03:00 |
| oobabooga | 7a49ceab29 | Update README.md | 2023-05-11 09:42:39 -03:00 |
| oobabooga | 943b5e5f80 | Minor bug fix | 2023-05-10 23:54:25 -03:00 |
| oobabooga | 9695bfe117 | Add an option for cleaning up html in superbooga | 2023-05-10 23:51:52 -03:00 |
| oobabooga | c7ba2d4f3f | Change a message in download-model.py | 2023-05-10 19:00:14 -03:00 |
| oobabooga | b7a589afc8 | Improve the Metharme prompt | 2023-05-10 16:09:32 -03:00 |
| oobabooga | e5b1547849 | Fix reload model button | 2023-05-10 14:44:25 -03:00 |
| oobabooga | b01c4884cb | Better stopping strings for instruct mode | 2023-05-10 14:22:38 -03:00 |
| oobabooga | 6a4783afc7 | Add markdown table rendering | 2023-05-10 13:41:23 -03:00 |
| oobabooga | 57dc44a995 | Update README.md | 2023-05-10 12:48:25 -03:00 |
| oobabooga | f5592781e5 | Update README.md | 2023-05-10 12:19:56 -03:00 |
| oobabooga | f1d10edcb7 | Update README.md | 2023-05-10 12:13:14 -03:00 |
| oobabooga | 181b102521 | Update README.md | 2023-05-10 12:09:47 -03:00 |
| oobabooga | 3316e33d14 | Remove unused code | 2023-05-10 11:59:59 -03:00 |
| Alexander Dibrov | ec14d9b725 | Fix custom_generate_chat_prompt (#1965) | 2023-05-10 11:29:59 -03:00 |
| oobabooga | 32481ec4d6 | Fix prompt order in the dropdown | 2023-05-10 02:24:09 -03:00 |
| oobabooga | dfd9ba3e90 | Remove duplicate code | 2023-05-10 02:07:22 -03:00 |
| oobabooga | cd36b8f739 | Remove space | 2023-05-10 01:41:33 -03:00 |
| oobabooga | c35860ff2f | Add a link to silero samples | 2023-05-10 01:39:35 -03:00 |
| oobabooga | bdf1274b5d | Remove duplicate code | 2023-05-10 01:34:04 -03:00 |
| oobabooga | ba445cf59f | Fix some galactica templates | 2023-05-09 22:58:59 -03:00 |