Mirror of https://github.com/ggerganov/llama.cpp.git, synced 2025-02-05 16:10:42 +01:00
server : (docs) added response format for /apply-template [no ci] (#11503)
commit 496e5bf46b
parent 7919256c57
@@ -584,6 +584,10 @@ Uses the server's prompt template formatting functionality to convert chat messages
`messages`: (Required) Chat turns in the same format as `/v1/chat/completions`.

**Response format**

Returns a JSON object with a field `prompt` containing a string of the input messages formatted according to the model's chat template format.
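As a sketch of how this fits together, assuming a server listening on the default `http://localhost:8080`, a request and response might look like this (the exact `prompt` string depends entirely on the loaded model's chat template; the ChatML-style output below is only illustrative):

```shell
curl http://localhost:8080/apply-template \
    -H "Content-Type: application/json" \
    -d '{
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello!"}
        ]
    }'

# Illustrative response for a ChatML-style template (actual output varies by model):
# {"prompt": "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n"}
```

The returned `prompt` string can then be passed on, for example as the `prompt` field of a `/completion` request, if you want to run the templated conversation through the model yourself.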
### POST `/embedding`: Generate embedding of a given text
> [!IMPORTANT]