| File | Last commit message | Last commit date |
|---|---|---|
| build.yml | cuda : rename build flag to LLAMA_CUDA (#6299) | 2024-03-26 01:16:01 +01:00 |
| close-issue.yml | ci : close inactive issue, increase operations per run (#6270) | 2024-03-24 10:57:06 +02:00 |
| code-coverage.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |
| docker.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |
| editorconfig.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |
| gguf-publish.yml | gguf.py : fix CI for publishing GGUF package (#3532) | 2023-10-07 22:14:10 +03:00 |
| nix-ci-aarch64.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |
| nix-ci.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |
| nix-flake-update.yml | ci: nix-flake-update: new token with pr permissions (#4879) | 2024-01-11 17:22:34 +00:00 |
| nix-publish-flake.yml | workflows: nix-flakestry: drop tag filters | 2023-12-31 13:14:58 -08:00 |
| python-check-requirements.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |
| python-lint.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |
| server.yml | common: llama_load_model_from_url split support (#6192) | 2024-03-23 18:07:00 +01:00 |
| zig-build.yml | ci: apply concurrency limit for github workflows (#6243) | 2024-03-22 19:15:06 +02:00 |