llama.cpp/ggml

Commit f922a9c542 by matt23654
[GGML][RPC] Support for models with non-512-aligned tensors over RPC. (#11047)
* Added init tensor calling code

* Added get_alloc_size forwarding

* Cleaned up and improved type/error handling.

* fix: remove trailing whitespaces.

* Cleanup and use GGML error logging functions.

* Handle potentially dangerous edge cases.

* Apply suggestions from code review

Co-authored-by: Diego Devesa <slarengh@gmail.com>

---------

Co-authored-by: Diego Devesa <slarengh@gmail.com>
2025-01-04 17:10:30 +01:00
include tts : add OuteTTS support (#10784) 2024-12-18 19:27:21 +02:00
src [GGML][RPC] Support for models with non-512-aligned tensors over RPC. (#11047) 2025-01-04 17:10:30 +01:00
.gitignore vulkan : cmake integration (#8119) 2024-07-13 18:12:39 +02:00
CMakeLists.txt ggml : do not install metal source when embed library (ggml/1054) 2025-01-04 16:09:53 +02:00