Mirror of https://github.com/ggerganov/llama.cpp.git (synced 2024-12-28 15:18:26 +01:00)
78203641fe
* server : added with_pieces functionality to /tokenize endpoint
* server : Add tokenize with pieces tests to server.feature
* Handle case if tokenizer splits along utf8 continuation bytes
* Add example of token splitting
* Remove trailing ws
* Fix trailing ws
* Maybe fix ci
* maybe this fix windows ci?

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
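The commit notes handling the case where the tokenizer splits a string along UTF-8 continuation bytes, i.e. a token piece may end mid-way through a multi-byte character and not be valid UTF-8 on its own. A minimal sketch of one way a server could serialize such pieces in a JSON response (the fallback-to-raw-bytes behavior here is an assumption for illustration, not necessarily the exact scheme the endpoint uses):

```python
def piece_to_json_value(piece_bytes: bytes):
    """Serialize a tokenizer piece for a JSON response: return a
    string when the piece is valid UTF-8, otherwise fall back to a
    list of raw byte values so no information is lost."""
    try:
        return piece_bytes.decode("utf-8")
    except UnicodeDecodeError:
        return list(piece_bytes)

# "ö" (U+00F6) encodes to the two bytes 0xC3 0xB6. A tokenizer may
# split between them, producing pieces that are not valid UTF-8.
whole = "ö".encode("utf-8")          # b'\xc3\xb6'
first, second = whole[:1], whole[1:]

print(piece_to_json_value(whole))    # "ö"
print(piece_to_json_value(first))    # [195]
print(piece_to_json_value(second))   # [182]
```

With a scheme like this, a client can reassemble the original bytes by concatenating string pieces (re-encoded as UTF-8) with the raw byte lists.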
steps
embeddings.feature
environment.py
issues.feature
lora.feature
parallel.feature
passkey.feature
results.feature
security.feature
server.feature
slotsave.feature
wrong_usages.feature