Mirror of https://github.com/ggerganov/llama.cpp.git, synced 2024-12-26 14:20:31 +01:00
Commit f4d277ae17:
* Add support for configs, add configurable prefixes / suffixes, deprecate instruct mode, add stop prompt
* Add multiline mode, update text input.
* bugfix
* update implementation
* typos
* Change --multiline implementation to be toggled by EOF.
* bugfix
* default multiline mode
* add more configs
* update formating
* update formatting
* apply suggestions
22 lines, 626 B, Plaintext
--ctx_size 2048
--batch_size 16
--repeat_penalty 1.15
--temp 0.4
--top_k 30
--top_p 0.18

--interactive-first
--keep -1

--ins-prefix-bos
--ins-prefix "\n\nUser: "
--ins-suffix "\n\nAssistant: "
--reverse-prompt "User: "

-p "You are an AI language model designed to assist the User by answering their questions, offering advice, and engaging in casual conversation in a friendly, helpful, and informative manner. You respond clearly, coherently, and you consider the conversation history.

User: Hey, how's it going?

Assistant: Hey there! I'm doing great, thank you. What can I help you with today? Let's have a fun chat!"
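For reference, the same settings could also be passed to the main example directly on the command line instead of through this config file. The sketch below is only an assumed usage, not part of the commit: the binary name ./main and the model path models/7B/ggml-model.bin are placeholders, and depending on your shell the "\n\n" sequences in the prefix/suffix strings may need different quoting or escaping.

# Hypothetical invocation; ./main and the model path are placeholders.
./main -m models/7B/ggml-model.bin \
  --ctx_size 2048 --batch_size 16 \
  --repeat_penalty 1.15 --temp 0.4 --top_k 30 --top_p 0.18 \
  --interactive-first --keep -1 \
  --ins-prefix-bos --ins-prefix "\n\nUser: " --ins-suffix "\n\nAssistant: " \
  --reverse-prompt "User: " \
  -p "You are an AI language model designed to assist the User ..."   # full prompt text as in the config above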