llama.cpp/examples/server/webui
Latest commit c07e87f38b by stduhpf
server : (webui) put DeepSeek R1 CoT in a collapsible <details> element (#11364)
* webui : put DeepSeek R1 CoT in a collapsible <details> element

* webui: refactor split

* webui: don't use regex to split cot and response

* webui: format+qol

* webui: no loading icon if the model isn't generating

* ui fix, add configs

* add jsdoc types

* only filter </think> for assistant msg

* build

* update build

---------

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
2025-01-24 09:02:38 +01:00
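The commit above separates the DeepSeek R1 chain-of-thought from the visible answer without a regex, filters the `</think>` tag only for assistant messages, and renders the reasoning inside a collapsible `<details>` element. The TypeScript sketch below illustrates one way such a split could work; the function names (`splitThinkBlock`, `renderAssistantMessage`) and the exact tag handling are assumptions for illustration, not the actual webui code.

```ts
// Hypothetical sketch, not the llama.cpp webui implementation.
// Splits an assistant message into chain-of-thought and visible response
// by scanning for the </think> tag (no regex), then wraps the CoT in a
// collapsible <details> element.

interface SplitMessage {
  cot: string | null;   // reasoning text inside <think>...</think>, if any
  content: string;      // the user-visible response
  thinking: boolean;    // true while </think> has not arrived yet (streaming)
}

function splitThinkBlock(raw: string): SplitMessage {
  // Only assistant messages are expected to carry a <think> block;
  // callers should skip this for user/system messages.
  const openTag = "<think>";
  const closeTag = "</think>";

  if (!raw.startsWith(openTag)) {
    return { cot: null, content: raw, thinking: false };
  }

  const end = raw.indexOf(closeTag);
  if (end === -1) {
    // Still streaming the chain of thought: everything after <think> is CoT.
    return { cot: raw.slice(openTag.length), content: "", thinking: true };
  }

  return {
    cot: raw.slice(openTag.length, end).trim(),
    content: raw.slice(end + closeTag.length).trim(),
    thinking: false,
  };
}

function renderAssistantMessage(raw: string): string {
  const { cot, content, thinking } = splitThinkBlock(raw);
  if (cot === null) return content;

  // Collapsible reasoning section; kept open while the model is still thinking.
  // A real UI would escape or markdown-render both parts; this sketch returns
  // raw strings only to show the split.
  const details =
    `<details${thinking ? " open" : ""}>` +
    `<summary>Thinking${thinking ? "…" : ""}</summary>` +
    `${cot}</details>`;
  return details + content;
}
```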
Name               | Last commit                                                                       | Last updated
public             | server: (UI) add syntax highlighting and latex math rendering (#10808)           | 2024-12-15 12:55:54 +01:00
src                | server : (webui) put DeepSeek R1 CoT in a collapsible <details> element (#11364) | 2025-01-24 09:02:38 +01:00
index.html         | server : (webui) put DeepSeek R1 CoT in a collapsible <details> element (#11364) | 2025-01-24 09:02:38 +01:00
package-lock.json  | server : (UI) fix missing async generator on safari (#10857)                     | 2024-12-17 09:52:09 +01:00
package.json       | server : (UI) fix missing async generator on safari (#10857)                     | 2024-12-17 09:52:09 +01:00
postcss.config.js  | server : (web ui) Various improvements, now use vite as bundler (#10599)         | 2024-12-03 19:38:44 +01:00
tailwind.config.js | server : (web ui) Various improvements, now use vite as bundler (#10599)         | 2024-12-03 19:38:44 +01:00
vite.config.js     | server: (UI) add syntax highlighting and latex math rendering (#10808)           | 2024-12-15 12:55:54 +01:00