llama.cpp/.github
Xuan Son Nguyen 45abe0f74e
server : replace behave with pytest (#10416)
* server : replace behave with pytest
* fix test on windows
* misc
* add more tests
* more tests
* styling
* log less, fix embd test
* added all sequential tests
* fix coding style
* fix save slot test
* add parallel completion test
* fix parallel test
* remove feature files
* update test docs
* no cache_prompt for some tests
* add test_cache_vs_nocache_prompt
2024-11-26 16:20:18 +01:00
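The commit above replaces the Gherkin feature files driven by behave with plain pytest test functions. As an illustration only, here is a minimal sketch of what such a test can look like; it is not the repository's actual test harness, and it assumes a llama-server instance is already listening on localhost:8080 and exposing the /health and /completion endpoints.

```python
# Minimal pytest sketch, not the repository's actual test utilities.
# Assumption: a llama-server instance is already running on localhost:8080.
import pytest
import requests

BASE_URL = "http://localhost:8080"  # assumed address of the running server


def test_server_health():
    # The server should report that it is ready to accept requests.
    res = requests.get(f"{BASE_URL}/health")
    assert res.status_code == 200


@pytest.mark.parametrize("cache_prompt", [True, False])
def test_completion(cache_prompt):
    # A completion request should succeed and return generated text,
    # whether or not prompt caching is requested.
    res = requests.post(
        f"{BASE_URL}/completion",
        json={
            "prompt": "Hello, my name is",
            "n_predict": 8,
            "cache_prompt": cache_prompt,
        },
    )
    assert res.status_code == 200
    body = res.json()
    assert isinstance(body["content"], str)
    assert len(body["content"]) > 0
```

Run with `pytest -v` after starting the server; parametrizing over cache_prompt mirrors the idea behind the test_cache_vs_nocache_prompt test mentioned in the commit messages.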
ISSUE_TEMPLATE Github: update issue templates [no ci] (#10489) 2024-11-25 19:18:37 +01:00
workflows server : replace behave with pytest (#10416) 2024-11-26 16:20:18 +01:00
labeler.yml ci : add ubuntu cuda build, build with one arch on windows (#10456) 2024-11-26 13:05:07 +01:00
pull_request_template.md github : update pr template 2024-06-16 10:46:51 +03:00