| File | Last commit message | Last commit date |
| --- | --- | --- |
| nix | nix: removed unnessesary indentation | 2024-03-28 07:48:27 +00:00 |
| cloud-v-pipeline | ci : Cloud-V for RISC-V builds (#3160) | 2023-09-15 11:06:56 +03:00 |
| full-cuda.Dockerfile | cuda : rename build flag to LLAMA_CUDA (#6299) | 2024-03-26 01:16:01 +01:00 |
| full-rocm.Dockerfile | python : add check-requirements.sh and GitHub workflow (#4585) | 2023-12-29 16:50:29 +02:00 |
| full.Dockerfile | python : add check-requirements.sh and GitHub workflow (#4585) | 2023-12-29 16:50:29 +02:00 |
| llama-cpp-clblast.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00 |
| llama-cpp-cuda.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00 |
| llama-cpp.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00 |
| main-cuda.Dockerfile | cuda : rename build flag to LLAMA_CUDA (#6299) | 2024-03-26 01:16:01 +01:00 |
| main-intel.Dockerfile | docker : add build for SYCL, Vulkan + update readme (#5228) | 2024-02-02 09:56:31 +02:00 |
| main-rocm.Dockerfile | python : add check-requirements.sh and GitHub workflow (#4585) | 2023-12-29 16:50:29 +02:00 |
| main-vulkan.Dockerfile | docker : add build for SYCL, Vulkan + update readme (#5228) | 2024-02-02 09:56:31 +02:00 |
| main.Dockerfile | Add llama.cpp docker support for non-latin languages (#1673) | 2023-06-08 00:58:53 -07:00 |
| server-cuda.Dockerfile | cuda : rename build flag to LLAMA_CUDA (#6299) | 2024-03-26 01:16:01 +01:00 |
| server-intel.Dockerfile | docker : add build for SYCL, Vulkan + update readme (#5228) | 2024-02-02 09:56:31 +02:00 |
| server-rocm.Dockerfile | docker : add server-first container images (#5157) | 2024-01-28 09:55:31 +02:00 |
| server-vulkan.Dockerfile | docker : add build for SYCL, Vulkan + update readme (#5228) | 2024-02-02 09:56:31 +02:00 |
| server.Dockerfile | server: add cURL support to server.Dockerfile (#6461) | 2024-04-03 19:56:37 +02:00 |
| tools.sh | docker : add finetune option (#4211) | 2023-11-30 23:46:01 +02:00 |