llama.cpp/examples/server/public
stduhpf c07e87f38b
server : (webui) put DeepSeek R1 CoT in a collapsible <details> element (#11364)
* webui : put DeepSeek R1 CoT in a collapsible <details> element

* webui: refactor split

* webui: don't use regex to split cot and response

* webui: format+qol

* webui: no loading icon if the model isn't generating

* ui fix, add configs

* add jsdoc types

* only filter </think> for assistant msg

* build

* update build

---------

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
2025-01-24 09:02:38 +01:00
index.html.gz    server : (webui) put DeepSeek R1 CoT in a collapsible <details> element (#11364)    2025-01-24 09:02:38 +01:00
loading.html     server : add loading html page while model is loading (#9468)    2024-09-13 14:23:11 +02:00
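
The commit above describes splitting a DeepSeek R1 assistant message into its chain-of-thought and the visible answer without a regex, and rendering the CoT inside a collapsible <details> element. The sketch below is only an illustration of that idea, not the actual webui code: the function names (splitThought, renderAssistantMessage), the "Thinking…" summary label, and the assumption that the CoT sits at the start of the message and ends with a literal </think> tag are all made up for this example.

```js
// Illustrative sketch only; names and behavior are assumptions, not the webui's code.

/**
 * Split an assistant message into its chain-of-thought and the response,
 * using a plain indexOf on the closing tag instead of a regex.
 * @param {string} content raw assistant message, possibly starting with <think>…</think>
 * @returns {{cot: string|null, response: string}}
 */
function splitThought(content) {
  const END_TAG = '</think>';
  const end = content.indexOf(END_TAG); // no regex, just a literal search
  if (end === -1) {
    return { cot: null, response: content };
  }
  // Strip the opening <think> if the template emitted one.
  const head = content.slice(0, end).replace('<think>', '');
  return {
    cot: head.trim(),
    response: content.slice(end + END_TAG.length).trim(),
  };
}

/**
 * Render the message, putting the CoT (if any) inside a collapsible <details>.
 * @param {string} content raw assistant message
 * @returns {HTMLElement}
 */
function renderAssistantMessage(content) {
  const { cot, response } = splitThought(content);
  const container = document.createElement('div');
  if (cot !== null) {
    const details = document.createElement('details'); // collapsed by default
    const summary = document.createElement('summary');
    summary.textContent = 'Thinking…';
    details.appendChild(summary);
    const pre = document.createElement('pre');
    pre.textContent = cot;
    details.appendChild(pre);
    container.appendChild(details);
  }
  const answer = document.createElement('p');
  answer.textContent = response;
  container.appendChild(answer);
  return container;
}
```

Searching for the literal closing tag with indexOf avoids multiline and special-character pitfalls that a regex split can run into on long CoT output, which appears to be the motivation behind the "don't use regex to split cot and response" commit; per the "only filter </think> for assistant msg" commit, this split would apply to assistant messages only.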