jllllll
2cf64474f2
Use chat_instruct_command in API ( #3482 )
2023-08-06 23:46:25 -03:00
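A minimal client-side sketch of what the commit above enables: passing chat_instruct_command through the API. The host/port, the /api/v1/chat path, the other payload fields, and the response shape are assumptions drawn from the webui's default setup; the template text mirrors the UI's default chat-instruct command and is purely illustrative.

```python
import requests

# Sketch only: endpoint, port, field names, and response shape are assumptions
# based on the commit title, not verified against this revision of the API.
HOST = "http://localhost:5000"

payload = {
    "user_input": "Summarize the plot of Hamlet in two sentences.",
    "mode": "chat-instruct",
    "history": {"internal": [], "visible": []},
    "chat_instruct_command": (
        'Continue the chat dialogue below. Write a single reply for the '
        'character "<|character|>".\n\n<|prompt|>'
    ),
    "max_new_tokens": 200,
}

response = requests.post(f"{HOST}/api/v1/chat", json=payload)
history = response.json()["results"][0]["history"]
print(history["visible"][-1][1])  # the latest bot reply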
oobabooga
0af10ab49b
Add Classifier Free Guidance (CFG) for Transformers/ExLlama ( #3325 )
2023-08-06 17:22:48 -03:00
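A hedged example of a guided request for the CFG commit above; guidance_scale and negative_prompt follow the Transformers naming convention and are assumed, not confirmed, to be the fields #3325 exposes.

```python
import requests

# Assumed parameter names (guidance_scale, negative_prompt) for CFG; the
# response shape follows the repo's blocking-API example and is also assumed.
payload = {
    "prompt": "Write a short poem about the sea.",
    "max_new_tokens": 120,
    "guidance_scale": 1.5,            # values > 1.0 enable guidance
    "negative_prompt": "free verse",  # steer generation away from this text
}

r = requests.post("http://localhost:5000/api/v1/generate", json=payload)
print(r.json()["results"][0]["text"])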
rafa-9
d578baeb2c
Use character settings from API properties if present ( #3428 )
2023-08-03 15:56:40 -03:00
oobabooga
e931844fe2
Add auto_max_new_tokens parameter ( #3419 )
2023-08-02 14:52:20 -03:00
oobabooga
ef8637e32d
Add extension example, replace input_hijack with chat_input_modifier ( #3307 )
2023-07-25 18:49:56 -03:00
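The chat_input_modifier hook introduced above replaces input_hijack. Below is a minimal sketch of an extension's script.py, with the (text, visible_text, state) signature assumed from the example extension this commit adds.

```python
# extensions/my_extension/script.py -- minimal sketch of the new hook.
# The (text, visible_text, state) signature is assumed from the example
# extension referenced in #3307.

def chat_input_modifier(text, visible_text, state):
    """
    Runs on every chat message before generation.
    - text: what is appended to the prompt sent to the model
    - visible_text: what is shown in the chat log
    """
    text = text.strip() + "\n\n(Please answer concisely.)"
    return text, visible_text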
oobabooga
e202190c4f
Lint
2023-07-12 11:33:25 -07:00
atriantafy
d9fabdde40
Add context_instruct to API. Load default model instruction template … ( #2688 )
2023-07-12 00:01:03 -03:00
Chris Rude
70b088843d
fix for issue #2475 : Streaming api deadlock ( #3048 )
2023-07-08 23:21:20 -03:00
oobabooga
70a4d5dbcf
Update chat API ( fixes #3006 )
2023-07-04 17:36:47 -07:00
oobabooga
3443219cbc
Add repetition penalty range parameter to transformers ( #2916 )
2023-06-29 13:40:13 -03:00
oobabooga
c52290de50
ExLlama with long context ( #2875 )
2023-06-25 22:49:26 -03:00
rizerphe
77baf43f6d
Add CORS support to the API ( #2718 )
2023-06-24 10:16:06 -03:00
oobabooga
7ef6a50e84
Reorganize model loading UI completely ( #2720 )
2023-06-16 19:00:37 -03:00
oobabooga
134430bbe2
Minor change
2023-06-14 11:34:42 -03:00
oobabooga
474dc7355a
Allow API requests to use parameter presets
2023-06-14 11:32:20 -03:00
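For the preset commit above, a hedged sketch of a request that names a preset instead of listing sampler values; the "preset" field name is an assumption, and simple-1 refers to one of the YAML files under the webui's presets/ folder.

```python
import requests

# Assumed field name "preset"; presets are the YAML files under presets/.
payload = {
    "prompt": "Explain what a context window is.",
    "preset": "simple-1",   # load sampler settings from presets/simple-1.yaml
    "max_new_tokens": 150,
}

r = requests.post("http://localhost:5000/api/v1/generate", json=payload)
print(r.json()["results"][0]["text"])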
oobabooga
c6552785af
Minor cleanup
2023-06-09 00:30:22 -03:00
matatonic
7be6fe126b
extensions/api: models API for blocking_api (updated) ( #2539 )
2023-06-08 11:34:36 -03:00
oobabooga
d183c7d29e
Fix streaming japanese/chinese characters
...
Credits to matatonic for the idea
2023-06-02 02:09:52 -03:00
Yiximail
4715123f55
Add a /api/v1/stop-stream API that allows the user to interrupt the generation ( #2392 )
2023-05-30 22:03:40 -03:00
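The endpoint path comes straight from the commit title above; the host, port, and empty JSON body are assumptions. A client can POST to it mid-generation to interrupt the stream:

```python
import requests

# /api/v1/stop-stream is taken from the commit title; host, port, and the
# empty body are assumptions.
r = requests.post("http://localhost:5000/api/v1/stop-stream", json={})
print(r.status_code)  # generation on the server should stop shortly after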
oobabooga
8e0a997c60
Add new parameters to API extension
2023-05-29 22:03:08 -03:00
Anthony K
7dc87984a2
Fix spelling mistake in the new 'name' var of the chat API ( #2309 )
2023-05-23 23:03:03 -03:00
oobabooga
74aae34beb
Allow passing your name to the chat API
2023-05-23 19:39:18 -03:00
oobabooga
4d94a111d4
Memoize load_character to speed up the chat API
2023-05-23 00:50:58 -03:00
oobabooga
c0fd7f3257
Add mirostat parameters for llama.cpp ( #2287 )
2023-05-22 19:37:24 -03:00
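A hedged sketch of a request using the new sampler; mirostat_mode, mirostat_tau, and mirostat_eta mirror llama.cpp's own option names and are assumed to be what #2287 exposes.

```python
import requests

# Assumed parameter names mirroring llama.cpp's mirostat options; response
# shape follows the repo's blocking-API example and is also assumed.
payload = {
    "prompt": "List three uses of dynamic programming.",
    "max_new_tokens": 200,
    "mirostat_mode": 2,    # 0 = off, 1 = Mirostat, 2 = Mirostat 2.0
    "mirostat_tau": 5.0,   # target entropy
    "mirostat_eta": 0.1,   # learning rate
}

r = requests.post("http://localhost:5000/api/v1/generate", json=payload)
print(r.json()["results"][0]["text"])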
oobabooga
8ac3636966
Add epsilon_cutoff/eta_cutoff parameters ( #2258 )
2023-05-21 15:11:57 -03:00
oobabooga
c5af549d4b
Add chat API ( #2233 )
2023-05-20 18:42:17 -03:00
atriantafy
26cf8c2545
Add API port options ( #1990 )
2023-05-15 20:44:16 -03:00
oobabooga
0d36c18f5d
Always return only the new tokens in generation functions
2023-05-11 17:07:20 -03:00
oobabooga
638c6a65a2
Refactor chat functions ( #2003 )
2023-05-11 15:37:04 -03:00
oobabooga
3913155c1f
Style improvements ( #1957 )
2023-05-09 22:49:39 -03:00
Wojtab
e9e75a9ec7
Generalize multimodality (LLaVA/MiniGPT-4 7B and 13B now supported) ( #1741 )
2023-05-09 20:18:02 -03:00
oobabooga
8aafb1f796
Refactor text_generation.py, add support for custom generation functions ( #1817 )
2023-05-05 18:53:03 -03:00
oobabooga
4bf7253ec5
Fix typing bug in API
2023-05-03 19:27:20 -03:00
oobabooga
88cdf6ed3d
Prevent websocket from disconnecting
2023-05-02 19:03:19 -03:00
oobabooga
68ed73dd89
Make API extension print its exceptions
2023-04-25 23:23:47 -03:00
MajdajkD
c86e9a3372
Fix websocket batching ( #1511 )
2023-04-24 03:51:32 -03:00
Andy Salerno
654933c634
New universal API with streaming/blocking endpoints ( #990 )
...
Previous title: Add api_streaming extension and update api-example-stream to use it
* Merge with latest main
* Add parameter capturing encoder_repetition_penalty
* Change some defaults, minor fixes
* Add --api, --public-api flags
* Remove unneeded/broken comment from blocking API startup; the message is already correctly emitted in try_start_cloudflared by calling the lambda we pass in.
* Update the on_start message for blocking_api; it should say 'non-streaming', not 'streaming'
* Update the API examples
* Change a comment
* Update README
* Remove the gradio API
* Remove unused import
* Minor change
* Remove unused import
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-04-23 15:52:43 -03:00
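The PR above replaces the Gradio API with dedicated blocking and streaming servers. A sketch of a streaming client follows; the ws://localhost:5005/api/v1/stream URI and the text_stream/stream_end event names are assumptions based on the api-example-stream script the PR mentions updating.

```python
import asyncio
import json

import websockets  # pip install websockets

# Assumed URI and event names for the streaming server added in #990.
URI = "ws://localhost:5005/api/v1/stream"

async def stream(prompt: str) -> None:
    async with websockets.connect(URI) as websocket:
        await websocket.send(json.dumps({"prompt": prompt, "max_new_tokens": 200}))
        async for message in websocket:
            data = json.loads(message)
            if data.get("event") == "text_stream":
                print(data["text"], end="", flush=True)
            elif data.get("event") == "stream_end":
                break

asyncio.run(stream("Once upon a time"))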
oobabooga
7ff645899e
Fix bug in API extension
2023-04-22 17:33:36 -03:00
AICatgirls
b992c9236a
Prevent API extension responses from getting cut off with --chat enabled ( #1467 )
2023-04-22 16:06:43 -03:00
oobabooga
ff0d0ac552
API extension bug fix
2023-04-20 13:26:58 -03:00
oobabooga
96ba55501c
Rename custom_stopping_strings in the API extension
2023-04-20 00:15:32 -03:00
oobabooga
b0c762ceba
Revert a change
...
I think that this may be needed for some clients
2023-04-18 04:10:45 -03:00
oobabooga
163ea295e7
Fix bug in API extension
2023-04-17 13:54:15 -03:00
oobabooga
b937c9d8c2
Add skip_special_tokens checkbox for Dolly model ( #1218 )
2023-04-16 14:24:49 -03:00
Tymec
832ee4323d
API: add endpoint for counting tokens ( #1051 )
2023-04-11 23:08:42 -03:00
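A hedged sketch of the token-counting endpoint; only "an endpoint for counting tokens" is stated by #1051, so the /api/v1/token-count path and the response shape are assumptions.

```python
import requests

# Assumed path and response shape for the token-counting endpoint.
r = requests.post(
    "http://localhost:5000/api/v1/token-count",
    json={"prompt": "How many tokens is this sentence?"},
)
print(r.json())  # expected to contain the token count for the prompt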
Alexander01998
61641a4551
Add missing new parameters to API extension
2023-04-11 22:41:13 -03:00
oobabooga
dc3c9d00a0
Update the API extension
2023-04-11 13:07:45 -03:00
oobabooga
64f5c90ee7
Fix the API extension
2023-04-10 20:14:38 -03:00
oobabooga
ea6e77df72
Make the code more PEP8-compliant for readability ( #862 )
2023-04-07 00:15:45 -03:00
oobabooga
3f3e42e26c
Refactor several function calls and the API
2023-04-06 01:22:15 -03:00