ardfork 978ba3d83d
Server: Don't ignore llama.cpp params (#8754)
* Don't ignore llama.cpp params
* Add fallback for max_tokens
2024-08-04 20:16:23 +02:00