Author | Commit | Message | Date
matatonic | 9714072692 | [extensions/openai] use instruction templates with chat_completions (#2291) | 2023-05-23 19:58:41 -03:00
oobabooga | c0fd7f3257 | Add mirostat parameters for llama.cpp (#2287) | 2023-05-22 19:37:24 -03:00
oobabooga | 8ac3636966 | Add epsilon_cutoff/eta_cutoff parameters (#2258) | 2023-05-21 15:11:57 -03:00
matatonic | 78b2478d9c | assistant: space fix, system: prompt fix (#2219) | 2023-05-20 23:32:34 -03:00
matatonic | ab08cf6465 | [extensions/openai] clip extra leading space (#2042) | 2023-05-14 12:57:52 -03:00
oobabooga | c746a5bd00 | Add .rstrip(' ') to openai api | 2023-05-12 14:40:48 -03:00
matatonic | f98fd01dcd | is_chat=False for /edits (#2011) | 2023-05-11 19:15:11 -03:00
oobabooga | 0d36c18f5d | Always return only the new tokens in generation functions | 2023-05-11 17:07:20 -03:00
matatonic | c4f0e6d740 | is_chat changes fix for openai extension (#2008) | 2023-05-11 16:32:25 -03:00
matatonic | 309b72e549 | [extension/openai] add edits & image endpoints & fix prompt return in non --chat modes (#1935) | 2023-05-11 11:06:39 -03:00
oobabooga | 3913155c1f | Style improvements (#1957) | 2023-05-09 22:49:39 -03:00
Jeffrey Lin | 791a38bad1 | [extensions/openai] Support undocumented base64 'encoding_format' param for compatibility with official OpenAI client (#1876) | 2023-05-08 22:31:34 -03:00
oobabooga | 8aafb1f796 | Refactor text_generation.py, add support for custom generation functions (#1817) | 2023-05-05 18:53:03 -03:00
Thireus ☠ | 4883e20fa7 | Fix openai extension script.py - TypeError: '_Environ' object is not callable (#1753) | 2023-05-03 09:51:49 -03:00
oobabooga | c31b0f15a7 | Remove some spaces | 2023-05-02 23:07:07 -03:00
oobabooga | 320fcfde4e | Style/pep8 improvements | 2023-05-02 23:05:38 -03:00
matatonic | 7ac41b87df | add openai compatible api (#1475) | 2023-05-02 22:49:53 -03:00