mirror of https://github.com/ggerganov/llama.cpp.git synced 2025-01-22 09:39:08 +01:00
llama.cpp/prompts
Latest commit: b608b55a3e by CRD716 (2023-05-11 18:10:19 +03:00)

prompts : model agnostic DAN ()

* add model-agnostic dan prompt
* quick readme update
* save a token
* Revert "quick readme update"
  This reverts commit 8dc342c069.
File                     Last commit                                                              Date
alpaca.txt               Revert "main : alternative instruct mode (Vicuna support, etc.) ()" ()  2023-04-14 22:58:43 +03:00
chat-with-bob.txt        Revert "main : alternative instruct mode (Vicuna support, etc.) ()" ()  2023-04-14 22:58:43 +03:00
chat-with-vicuna-v0.txt  examples : read chat prompts from a template file ()                    2023-05-03 20:58:11 +03:00
chat-with-vicuna-v1.txt  examples : read chat prompts from a template file ()                    2023-05-03 20:58:11 +03:00
chat.txt                 examples : read chat prompts from a template file ()                    2023-05-03 20:58:11 +03:00
dan-modified.txt         prompts : model agnostic DAN ()                                         2023-05-11 18:10:19 +03:00
dan.txt                  prompts : model agnostic DAN ()                                         2023-05-11 18:10:19 +03:00
reason-act.txt           do not force the prompt file to end with a new line ()                  2023-04-13 11:33:16 +02:00