diff --git a/Templates-supported-by-llama_chat_apply_template.md b/Templates-supported-by-llama_chat_apply_template.md
index 7f392a1..d2cefb4 100644
--- a/Templates-supported-by-llama_chat_apply_template.md
+++ b/Templates-supported-by-llama_chat_apply_template.md
@@ -116,6 +116,15 @@ again
 response
 ```
 
+```
+Usage: ./server -m ... --chat-template orion
+Human: hello
+
+Assistant: responseHuman: again
+
+Assistant: response
+```
+
 Additionally, we also support zephyr template (I cannot find it on huggingface, but have seen in [this list](https://github.com/ggerganov/llama.cpp/blob/c8d847d57efdc0f9bbbf881d48c645e151b36fd8/examples/server/public/promptFormats.js) )
 
 ```
@@ -132,15 +141,6 @@ again<|endoftext|>
 response<|endoftext|>
 ```
 
-```
-Usage: ./server -m ... --chat-template orion
-Human: hello
-
-Assistant: responseHuman: again
-
-Assistant: response
-```
-
 ## Custom chat templates
 Currently, it's not possible to use your own chat template with llama.cpp server's `/chat/completions`