tdrussell
72f6fc6923
Rename additive_repetition_penalty to presence_penalty, add frequency_penalty ( #4376 )
2023-10-25 12:10:28 -03:00
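The commit above turns the old additive penalty into OpenAI-style presence and frequency penalties. A minimal sketch of how such penalties are commonly applied to next-token logits (an illustration only, not the repository's sampler code; the function and tensor names are assumptions):

```python
import torch

def apply_presence_frequency_penalties(logits, generated_ids,
                                        presence_penalty, frequency_penalty):
    """Subtract OpenAI-style penalties for tokens already present in the output.

    logits: 1-D tensor of next-token logits, shape (vocab_size,)
    generated_ids: 1-D tensor of token ids generated so far
    """
    if generated_ids.numel() == 0:
        return logits
    # How many times each token id has appeared in the output so far.
    counts = torch.bincount(generated_ids, minlength=logits.shape[-1]).to(logits.dtype)
    # frequency_penalty grows with the count; presence_penalty is a flat
    # subtraction for any token that has appeared at least once.
    return logits - frequency_penalty * counts - presence_penalty * (counts > 0).to(logits.dtype)
```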
tdrussell
4440f87722
Add additive_repetition_penalty sampler setting. ( #3627 )
2023-10-23 02:28:07 -03:00
oobabooga
08cf150c0c
Add a grammar editor to the UI ( #4061 )
2023-09-24 18:05:24 -03:00
oobabooga
b227e65d86
Add grammar to llama.cpp loader ( closes #4019 )
2023-09-24 07:10:45 -07:00
oobabooga
d43d150b1e
Fix a bug in the chat API ( closes #4034 )
2023-09-22 09:40:07 -07:00
oobabooga
d6814d7c15
Fix a bug in the API ( closes #4027 )
2023-09-21 17:54:53 -07:00
saltacc
f01b9aa71f
Add customizable ban tokens ( #3899 )
2023-09-15 18:27:27 -03:00
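Token banning is typically just a logit mask. A hedged sketch of the idea (the helper name is hypothetical, not the code added in #3899):

```python
import torch

def ban_tokens(logits, banned_token_ids):
    """Make the banned token ids unsamplable by setting their logits to -inf."""
    if banned_token_ids:
        logits[..., banned_token_ids] = float("-inf")
    return logits
```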
oobabooga
cec8db52e5
Add max_tokens_second param ( #3533 )
2023-08-29 17:44:31 -03:00
ausboss
a954b3e7de
Fixes error when not specifying tunnel id ( #3606 )
2023-08-17 15:20:36 -03:00
oobabooga
a1a9ec895d
Unify the 3 interface modes ( #3554 )
2023-08-13 01:12:15 -03:00
Friedemann Lipphardt
901b028d55
Add option for named cloudflare tunnels ( #3364 )
2023-08-08 22:20:27 -03:00
jllllll
2cf64474f2
Use chat_instruct_command in API ( #3482 )
2023-08-06 23:46:25 -03:00
oobabooga
0af10ab49b
Add Classifier Free Guidance (CFG) for Transformers/ExLlama ( #3325 )
2023-08-06 17:22:48 -03:00
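Classifier-Free Guidance runs the model twice, with and without the conditioning text (or with a negative prompt), and extrapolates between the two distributions. A minimal sketch of the usual combination step, assuming both sets of logits are already computed (not the Transformers/ExLlama-specific code):

```python
def cfg_combine(cond_logits, uncond_logits, guidance_scale):
    """Standard CFG formula: uncond + scale * (cond - uncond).

    A scale of 1.0 reproduces the conditional logits; larger values push
    generation further toward the conditioned prompt. In practice the
    combination is often done on log-probabilities rather than raw logits.
    """
    return uncond_logits + guidance_scale * (cond_logits - uncond_logits)
```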
rafa-9
d578baeb2c
Use character settings from API properties if present ( #3428 )
2023-08-03 15:56:40 -03:00
oobabooga
e931844fe2
Add auto_max_new_tokens parameter ( #3419 )
2023-08-02 14:52:20 -03:00
oobabooga
ef8637e32d
Add extension example, replace input_hijack with chat_input_modifier ( #3307 )
2023-07-25 18:49:56 -03:00
oobabooga
e202190c4f
lint
2023-07-12 11:33:25 -07:00
atriantafy
d9fabdde40
Add context_instruct to API. Load default model instruction template … ( #2688 )
2023-07-12 00:01:03 -03:00
Chris Rude
70b088843d
Fix for issue #2475: Streaming API deadlock ( #3048 )
2023-07-08 23:21:20 -03:00
oobabooga
70a4d5dbcf
Update chat API ( fixes #3006 )
2023-07-04 17:36:47 -07:00
oobabooga
3443219cbc
Add repetition penalty range parameter to transformers ( #2916 )
2023-06-29 13:40:13 -03:00
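The range parameter restricts the penalty to the most recent tokens instead of the whole context. A hedged sketch of a CTRL-style repetition penalty with such a window (an illustration, not the exact code from #2916; treating a range of 0 as "whole context" is an assumption):

```python
import torch

def ranged_repetition_penalty(logits, input_ids, penalty, rep_range):
    """Penalize tokens seen in the last `rep_range` positions of the context."""
    window = input_ids if rep_range == 0 else input_ids[-rep_range:]
    seen = torch.unique(window)
    score = logits[seen]
    # CTRL-style: divide positive logits by the penalty, multiply negative ones,
    # so repeated tokens always become less likely.
    logits[seen] = torch.where(score > 0, score / penalty, score * penalty)
    return logits
```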
oobabooga
c52290de50
ExLlama with long context ( #2875 )
2023-06-25 22:49:26 -03:00
oobabooga
134430bbe2
Minor change
2023-06-14 11:34:42 -03:00
oobabooga
474dc7355a
Allow API requests to use parameter presets
2023-06-14 11:32:20 -03:00
oobabooga
8e0a997c60
Add new parameters to API extension
2023-05-29 22:03:08 -03:00
Anthony K
7dc87984a2
Fix spelling mistake in new name var of chat API ( #2309 )
2023-05-23 23:03:03 -03:00
oobabooga
74aae34beb
Allow passing your name to the chat API
2023-05-23 19:39:18 -03:00
oobabooga
4d94a111d4
Memoize load_character to speed up the chat API
2023-05-23 00:50:58 -03:00
oobabooga
c0fd7f3257
Add mirostat parameters for llama.cpp ( #2287 )
2023-05-22 19:37:24 -03:00
oobabooga
8ac3636966
Add epsilon_cutoff/eta_cutoff parameters ( #2258 )
2023-05-21 15:11:57 -03:00
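Both cutoffs are truncation samplers from Hewitt et al. (2022): epsilon sampling drops tokens below a fixed probability, while eta sampling adapts the threshold to the entropy of the distribution. A hedged sketch (a full implementation would also guarantee that at least one token survives the mask):

```python
import torch

def epsilon_cutoff(logits, epsilon):
    """Drop tokens whose probability falls below a fixed threshold."""
    probs = torch.softmax(logits, dim=-1)
    return logits.masked_fill(probs < epsilon, float("-inf"))

def eta_cutoff(logits, epsilon):
    """Entropy-adaptive threshold: eta = min(epsilon, sqrt(epsilon) * exp(-H))."""
    probs = torch.softmax(logits, dim=-1)
    entropy = -(probs * torch.log(probs.clamp_min(1e-10))).sum(dim=-1)
    eta = torch.clamp((epsilon ** 0.5) * torch.exp(-entropy), max=epsilon)
    return logits.masked_fill(probs < eta, float("-inf"))
```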
oobabooga
c5af549d4b
Add chat API ( #2233 )
2023-05-20 18:42:17 -03:00
Wojtab
e9e75a9ec7
Generalize multimodality (llava/minigpt4 7b and 13b now supported) ( #1741 )
2023-05-09 20:18:02 -03:00
oobabooga
4bf7253ec5
Fix typing bug in API
2023-05-03 19:27:20 -03:00
oobabooga
68ed73dd89
Make API extension print its exceptions
2023-04-25 23:23:47 -03:00
Andy Salerno
654933c634
New universal API with streaming/blocking endpoints ( #990 )
...
Previous title: Add api_streaming extension and update api-example-stream to use it
* Merge with latest main
* Add parameter capturing encoder_repetition_penalty
* Change some defaults, minor fixes
* Add --api, --public-api flags
* Remove unneeded/broken comment from blocking API startup. The comment is already correctly emitted in try_start_cloudflared by calling the lambda we pass in.
* Update on_start message for blocking_api; it should say 'non-streaming', not 'streaming'
* Update the API examples
* Change a comment
* Update README
* Remove the gradio API
* Remove unused import
* Minor change
* Remove unused import
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-04-23 15:52:43 -03:00
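With the extension loaded via --api (or --public-api for a Cloudflare tunnel), text can be requested over the blocking HTTP endpoint; the streaming endpoint uses a websocket instead. A hedged usage sketch; the port, path, and response shape below follow the legacy API's usual defaults and should be treated as assumptions:

```python
import requests

# Assumed defaults of the legacy blocking API: port 5000, /api/v1/generate.
URL = "http://127.0.0.1:5000/api/v1/generate"

payload = {
    "prompt": "Write a haiku about autumn.",
    "max_new_tokens": 80,
    # Sampling parameters travel in the same request body as the prompt.
    "temperature": 0.7,
    "top_p": 0.9,
}

response = requests.post(URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["results"][0]["text"])
```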