mirror of https://github.com/ggerganov/llama.cpp.git synced 2025-01-24 02:19:18 +01:00
llama.cpp/prompts (last commit: 2023-09-14 12:32:10 -04:00)
File                      Last commit                                                              Date
alpaca.txt                Revert "main : alternative instruct mode (Vicuna support, etc.) ()" ()  2023-04-14 22:58:43 +03:00
chat-with-baichuan.txt    feature : support Baichuan serial models ()                              2023-09-14 12:32:10 -04:00
chat-with-bob.txt         Revert "main : alternative instruct mode (Vicuna support, etc.) ()" ()  2023-04-14 22:58:43 +03:00
chat-with-vicuna-v0.txt   examples : read chat prompts from a template file ()                     2023-05-03 20:58:11 +03:00
chat-with-vicuna-v1.txt   examples : read chat prompts from a template file ()                     2023-05-03 20:58:11 +03:00
chat.txt                  examples : read chat prompts from a template file ()                     2023-05-03 20:58:11 +03:00
dan-modified.txt          prompts : model agnostic DAN ()                                          2023-05-11 18:10:19 +03:00
dan.txt                   prompts : model agnostic DAN ()                                          2023-05-11 18:10:19 +03:00
reason-act.txt            do not force the prompt file to end with a new line ()                   2023-04-13 11:33:16 +02:00