Updated Templates supported by llama_chat_apply_template (markdown)
parent 5dee763ef3
commit 86f794b1f9
@@ -133,7 +133,7 @@ response<|endoftext|>
## Custom chat templates
-Currently, it's not possible to use your own chat template with `/chat/completions`
+Currently, it's not possible to use your own chat template with llama.cpp server's `/chat/completions`
One of the possible solutions is to use the `/completions` endpoint instead and write your own code (for example, in Python) to apply a template before passing the final prompt to `/completions`
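
As a rough illustration of that workaround, here is a minimal Python sketch that renders a ChatML-style template client-side and sends the result to the server's `/completions` endpoint. The server address, the ChatML template, and the stop marker are assumptions for illustration; adjust them to your model and setup.

```python
# Sketch only: apply a chat template client-side, then call the llama.cpp
# server's /completions endpoint with the rendered prompt.
import requests

SERVER_URL = "http://localhost:8080"  # assumed local llama.cpp server address

def apply_chatml_template(messages):
    """Render messages with a ChatML-style template; adjust for your model."""
    prompt = ""
    for msg in messages:
        prompt += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
    prompt += "<|im_start|>assistant\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

response = requests.post(
    f"{SERVER_URL}/completions",
    json={
        "prompt": apply_chatml_template(messages),
        "n_predict": 128,          # maximum number of tokens to generate
        "stop": ["<|im_end|>"],    # stop at the assumed end-of-turn marker
    },
)
print(response.json()["content"])
```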