llama.cpp/prompts

Latest commit: khimaros 6daa09d879 "examples : read chat prompts from a template file" (#1196), 2023-05-03 20:58:11 +03:00
File                    | Last commit                                                                     | Date
alpaca.txt              | Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982) | 2023-04-14 22:58:43 +03:00
chat-with-bob.txt       | Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982) | 2023-04-14 22:58:43 +03:00
chat-with-vicuna-v0.txt | examples : read chat prompts from a template file (#1196)                      | 2023-05-03 20:58:11 +03:00
chat-with-vicuna-v1.txt | examples : read chat prompts from a template file (#1196)                      | 2023-05-03 20:58:11 +03:00
chat.txt                | examples : read chat prompts from a template file (#1196)                      | 2023-05-03 20:58:11 +03:00
dan.txt                 | examples : various prompt and example fixes (#1298)                            | 2023-05-03 18:26:47 +03:00
reason-act.txt          | do not force the prompt file to end with a new line (#908)                     | 2023-04-13 11:33:16 +02:00