Mirror of https://github.com/ggerganov/llama.cpp.git, synced 2024-12-27 06:39:25 +01:00.
Commit 78203641fe

* server : added with_pieces functionality to /tokenize endpoint
* server : Add tokenize with pieces tests to server.feature
* Handle case if tokenizer splits along utf8 continuation bytes
* Add example of token splitting
* Remove trailing ws
* Fix trailing ws
* Maybe fix ci
* maybe this fix windows ci?

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
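Below is a minimal sketch of how a client might exercise the new `with_pieces` option on the server's `/tokenize` endpoint. The request/response shape (an object per token carrying `id` and `piece`, with the server address assumed to be `http://localhost:8080`) is inferred from the commit message, not verified against the server source, so treat it as an illustration rather than the definitive API.

```python
# Hedged example: call /tokenize with the with_pieces flag and print the
# returned token pieces. Endpoint shape and default address are assumptions
# based on the commit message above.
import requests

SERVER_URL = "http://localhost:8080"  # assumed default llama-server address

resp = requests.post(
    f"{SERVER_URL}/tokenize",
    json={"content": "Hello, world!", "with_pieces": True},
)
resp.raise_for_status()

for tok in resp.json()["tokens"]:
    # With with_pieces enabled, each entry is expected to carry the token id
    # plus its text piece; per the commit notes, pieces that split a UTF-8
    # character may be reported differently (e.g. as raw bytes) -- assumption.
    print(tok["id"], repr(tok["piece"]))
```

Handling the case where the tokenizer splits along UTF-8 continuation bytes (mentioned in the commit message) matters because such pieces are not valid standalone UTF-8 strings and cannot be returned as plain JSON text.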
Workflow files in this directory:

* bench.yml.disabled
* build.yml
* close-issue.yml
* docker.yml
* editorconfig.yml
* gguf-publish.yml
* labeler.yml
* nix-ci-aarch64.yml
* nix-ci.yml
* nix-flake-update.yml
* nix-publish-flake.yml
* python-check-requirements.yml
* python-lint.yml
* python-type-check.yml
* server.yml