Elbios 0d4177126b
llava : fix memory management bug (#5491)
* Fix memory management in llava and server code

Fixes this error:

llama_new_context_with_model: graph splits (measure): 3
Available slots:
 -> Slot 0 - max context: 6000
{"timestamp":1707926446,"level":"INFO","function":"main","line":2623,"message":"model loaded"}
all slots are idle and system prompt is empty, clear the KV cache
slot 0 - loaded image
slot 0 is processing [task id: 0]
slot 0 : kv cache rm - [0, end)
slot 0 - encoding image [id: 1]
munmap_chunk(): invalid pointer
Aborted

* Make it cleaner by checking the size in the batch free wrapper
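
For reference, a minimal sketch of the idea behind such a wrapper (not the actual patch; the owned_batch name, its fields, and constructors are illustrative). It only calls llama_batch_free when the recorded allocation size is nonzero, i.e. when the batch's buffers really came from llama_batch_init; a batch built by hand around an externally owned buffer, such as a llava image embedding, is borrowed and must not be freed, otherwise glibc is handed a pointer it never allocated and aborts with "munmap_chunk(): invalid pointer".

#include <cstdint>
#include "llama.h"

// Illustrative RAII wrapper: free a batch only if we allocated it ourselves.
struct owned_batch {
    llama_batch batch;
    int32_t     alloc_size; // 0 means the batch only borrows its buffers

    // allocate: every pointer inside batch comes from llama_batch_init
    owned_batch(int32_t n_tokens, int32_t embd, int32_t n_seq_max)
        : batch(llama_batch_init(n_tokens, embd, n_seq_max)), alloc_size(n_tokens) {}

    // borrow: wrap an existing batch without taking ownership of its memory
    explicit owned_batch(const llama_batch & borrowed)
        : batch(borrowed), alloc_size(0) {}

    ~owned_batch() {
        if (alloc_size > 0) {
            llama_batch_free(batch); // safe: buffers were allocated by llama_batch_init
        }
    }

    owned_batch(const owned_batch &) = delete;
    owned_batch & operator=(const owned_batch &) = delete;
};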
2024-02-15 10:01:57 +02:00