Mirror of https://github.com/ggerganov/llama.cpp.git, synced 2025-01-13 05:42:22 +01:00

llama.cpp / examples / server / tests / features

Latest commit: f298cc63d2 "server: update refs -> llama-server" (Olivier Chafik, 2024-06-06 15:44:40 +01:00)

steps/                server: update refs -> llama-server  2024-06-06 15:44:40 +01:00
embeddings.feature    Improve usability of --model-url & related flags (#6930)  2024-04-30 00:52:50 +01:00
environment.py        server tests : more pythonic process management; fix bare except: (#6146)  2024-03-20 06:33:49 +01:00
issues.feature        server: tests: passkey challenge / self-extend with context shift demo (#5832)  2024-03-02 22:00:14 +01:00
parallel.feature      common: llama_load_model_from_url split support (#6192)  2024-03-23 18:07:00 +01:00
passkey.feature       server: tests: passkey challenge / self-extend with context shift demo (#5832)  2024-03-02 22:00:14 +01:00
results.feature       server : fix temperature + disable some tests (#7409)  2024-05-20 22:10:03 +10:00
security.feature      json-schema-to-grammar improvements (+ added to server) (#5978)  2024-03-21 11:50:43 +00:00
server.feature        Tokenizer SPM fixes for phi-3 and llama-spm (bugfix) (#7425)  2024-05-21 14:39:48 +02:00
slotsave.feature      Tokenizer SPM fixes for phi-3 and llama-spm (bugfix) (#7425)  2024-05-21 14:39:48 +02:00
wrong_usages.feature  server: tests: passkey challenge / self-extend with context shift demo (#5832)  2024-03-02 22:00:14 +01:00