llama.cpp/ggml/src/ggml-vulkan

Latest commit fd08255d0d by Johannes Gäßler (2025-02-04 22:21:42 +01:00):
    CUDA: non-contiguous (RMS) norm support (#11659)
    Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
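
For context on the "non-contiguous (RMS) norm" commit above, the sketch below shows RMS normalization over a row whose elements sit an arbitrary stride apart in memory (stride 1 is the contiguous case). The function name, signature, and layout are illustrative assumptions only; this is not ggml's actual CUDA or Vulkan kernel.

    #include <cmath>
    #include <cstddef>

    // Illustrative sketch: RMS-normalize one row of n elements spaced
    // `stride` floats apart in memory (assumed layout, not ggml's).
    static void rms_norm_row(const float * src, float * dst,
                             size_t n, size_t stride, float eps) {
        // Mean of squares over the row.
        float sum_sq = 0.0f;
        for (size_t i = 0; i < n; ++i) {
            const float x = src[i * stride];
            sum_sq += x * x;
        }
        const float scale = 1.0f / std::sqrt(sum_sq / n + eps);

        // Scale each element; the same stride handles non-contiguous rows.
        for (size_t i = 0; i < n; ++i) {
            dst[i * stride] = src[i * stride] * scale;
        }
    }
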
Name              Last commit                                                                Last updated
cmake             fix: ggml: fix vulkan-shaders-gen build (#10448)                           2025-01-15 14:17:42 +01:00
vulkan-shaders    vulkan: implement initial support for IQ2 and IQ3 quantizations (#11360)   2025-01-29 18:29:39 +01:00
CMakeLists.txt    fix: ggml: fix vulkan-shaders-gen build (#10448)                           2025-01-15 14:17:42 +01:00
ggml-vulkan.cpp   CUDA: non-contiguous (RMS) norm support (#11659)                           2025-02-04 22:21:42 +01:00