Updated Templates supported by llama_chat_apply_template (markdown)

Xuan Son Nguyen 2024-03-04 12:23:08 +01:00
parent 5dee763ef3
commit 86f794b1f9

@@ -133,7 +133,7 @@ response<|endoftext|>
## Custom chat templates
-Currently, it's not possible to use your own chat template with `/chat/completions`
+Currently, it's not possible to use your own chat template with llama.cpp server's `/chat/completions`
One possible solution is to use the `/completions` endpoint instead and write your own code (for example, in Python) to apply a template before passing the final prompt to `/completions`, as sketched below.
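
As an illustration only (not part of the original wiki page), here is a minimal sketch of that workaround. It assumes a ChatML-style template, a llama.cpp server listening on the default `http://localhost:8080`, and the `requests` library; the exact template string, sampling parameters, and message contents are placeholders you would adapt to your own model and setup.

```python
# Minimal sketch: apply a ChatML-style chat template by hand, then send the
# resulting prompt to the llama.cpp server's /completions endpoint.
# The server address, template format, and sampling parameters are assumptions;
# adjust them to whatever your model and server configuration actually use.
import requests

SERVER_URL = "http://localhost:8080/completions"  # assumed default llama.cpp server address

def apply_chatml_template(messages):
    # Concatenate the messages in ChatML format, ending with an open
    # assistant turn so the model continues as the assistant.
    prompt = ""
    for msg in messages:
        prompt += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
    return prompt + "<|im_start|>assistant\n"

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a haiku about llamas."},
]

resp = requests.post(
    SERVER_URL,
    json={
        "prompt": apply_chatml_template(messages),
        "n_predict": 128,        # maximum number of tokens to generate
        "stop": ["<|im_end|>"],  # stop once the assistant turn ends
    },
)
resp.raise_for_status()
print(resp.json()["content"])
```

The same idea applies to any other template: the only requirement is that the string you build matches the format the model was trained with, since `/completions` passes the prompt through without applying any template.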