From e4f77876f97ee8694844bd307c3233345b7d16b4 Mon Sep 17 00:00:00 2001
From: Xuan Son Nguyen
Date: Thu, 14 Mar 2024 22:21:55 +0100
Subject: [PATCH] Updated Templates supported by llama_chat_apply_template
 (markdown)

---
 ...s-supported-by-llama_chat_apply_template.md | 18 +++++++++---------
 1 file changed, 9 insertions(+), 9 deletions(-)

diff --git a/Templates-supported-by-llama_chat_apply_template.md b/Templates-supported-by-llama_chat_apply_template.md
index 7f392a1..d2cefb4 100644
--- a/Templates-supported-by-llama_chat_apply_template.md
+++ b/Templates-supported-by-llama_chat_apply_template.md
@@ -116,6 +116,15 @@ again
 response
 ```
 
+```
+Usage: ./server -m ... --chat-template orion
+Human: hello
+
+Assistant: responseHuman: again
+
+Assistant: response
+```
+
 Additionally, we also support zephyr template (I cannot find it on huggingface, but have seen in [this list](https://github.com/ggerganov/llama.cpp/blob/c8d847d57efdc0f9bbbf881d48c645e151b36fd8/examples/server/public/promptFormats.js) )
 
 ```
 <|user|>
 hello<|endoftext|>
 <|assistant|>
 response<|endoftext|>
 <|user|>
 again<|endoftext|>
 <|assistant|>
 response<|endoftext|>
 ```
 
-```
-Usage: ./server -m ... --chat-template orion
-Human: hello
-
-Assistant: responseHuman: again
-
-Assistant: response
-```
-
 ## Custom chat templates
 
 Currently, it's not possible to use your own chat template with llama.cpp server's `/chat/completions`
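For reference, the two prompt formats that appear in this patch (the orion transcript being moved, and the zephyr example) can be sketched in Python. This is only an illustrative re-implementation mirroring the example transcripts above, not llama.cpp's actual `llama_chat_apply_template` code; real templates may add special tokens (e.g. BOS/EOS) that do not show up in the printed transcripts.

```python
# Illustrative sketch of the two prompt formats shown in this patch.
# NOT llama.cpp's implementation -- it only mirrors the example transcripts.

def format_orion(messages):
    """Orion-style transcript: 'Human: ...' then 'Assistant: ' with the reply
    appended directly, so the next 'Human:' follows without a newline
    (matching the 'responseHuman:' run seen in the example)."""
    out = ""
    for role, content in messages:
        if role == "user":
            out += "Human: " + content + "\n\nAssistant: "
        else:  # treat anything else as the assistant reply
            out += content  # no trailing newline, per the transcript
    return out

def format_zephyr(messages, add_assistant_prompt=True):
    """Zephyr-style: a <|role|> header line, content ended by <|endoftext|>."""
    out = ""
    for role, content in messages:
        out += "<|" + role + "|>\n" + content + "<|endoftext|>\n"
    if add_assistant_prompt:
        out += "<|assistant|>\n"  # cue the model to generate a reply
    return out

msgs = [("user", "hello"), ("assistant", "response"),
        ("user", "again"), ("assistant", "response")]

print(format_orion(msgs))
print(format_zephyr(msgs, add_assistant_prompt=False))
```

Running this reproduces the two transcripts shown in the page diff, including the `Assistant: responseHuman: again` concatenation that motivated moving the orion block next to the orion section.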