Updated Templates supported by llama_chat_apply_template (markdown)
parent 420b3fdda3
commit e4f77876f9
@@ -116,6 +116,15 @@ again<end_of_turn>
response<end_of_turn>
```

```
Usage: ./server -m ... --chat-template orion
<s>Human: hello

Assistant: </s>response</s>Human: again

Assistant: </s>response</s>
```

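As an illustration (not part of the original page), the exchange above can be driven through the server's OpenAI-compatible chat endpoint once it has been started as in the `Usage` line. A minimal sketch, assuming the default listen address `http://localhost:8080`; the message contents simply mirror the hello/response/again example:

```python
# Minimal sketch: query a llama.cpp server started with
# `./server -m ... --chat-template orion` (assumes default port 8080).
import json
import urllib.request

payload = {
    "messages": [
        {"role": "user", "content": "hello"},
        {"role": "assistant", "content": "response"},
        {"role": "user", "content": "again"},
    ]
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

# The response follows the OpenAI chat completion shape.
print(body["choices"][0]["message"]["content"])
```

The server applies the selected chat template to the `messages` array before inference, so the client never builds the prompt string itself.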
Additionally, we also support the zephyr template (I cannot find it on huggingface, but have seen it in [this list](https://github.com/ggerganov/llama.cpp/blob/c8d847d57efdc0f9bbbf881d48c645e151b36fd8/examples/server/public/promptFormats.js)):

```
@@ -132,15 +141,6 @@ again<|endoftext|>
response<|endoftext|>
```

```
Usage: ./server -m ... --chat-template orion
<s>Human: hello

Assistant: </s>response</s>Human: again

Assistant: </s>response</s>
```

## Custom chat templates

Currently, it's not possible to use your own chat template with the llama.cpp server's `/chat/completions` endpoint.
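Until that is supported, one possible workaround (a sketch, not an official feature) is to render the conversation yourself and send the resulting string to the server's `/completion` endpoint, which accepts an already-formatted prompt. The `### User:` / `### Assistant:` markers below are hypothetical placeholders for whatever format your model expects:

```python
# Sketch of a client-side workaround: apply a custom chat template yourself
# and send the raw prompt to the server's /completion endpoint.
import json
import urllib.request

def render_custom_template(messages):
    # Hypothetical template; replace with the format your model was trained on.
    parts = []
    for m in messages:
        role = "User" if m["role"] == "user" else "Assistant"
        parts.append(f"### {role}:\n{m['content']}\n")
    parts.append("### Assistant:\n")  # ask the model to answer as the assistant
    return "".join(parts)

messages = [
    {"role": "user", "content": "hello"},
    {"role": "assistant", "content": "response"},
    {"role": "user", "content": "again"},
]

payload = {
    "prompt": render_custom_template(messages),
    "n_predict": 128,
    "stop": ["### User:"],  # keep the model from starting the next user turn
}

req = urllib.request.Request(
    "http://localhost:8080/completion",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["content"])
```

Pick stop strings that match whatever template you use, otherwise the model may run on into a new user turn.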