Mirror of https://github.com/oobabooga/text-generation-webui.git, synced 2024-11-22 08:07:56 +01:00

New NovelAI/KoboldAI preset selection

commit 6c40f7eeb4, parent ce4756fb88
README.md

@@ -140,6 +140,8 @@ Out of memory errors? [Check this guide](https://github.com/oobabooga/text-gener
 
 Inference settings presets can be created under `presets/` as text files. These files are detected automatically at startup.
 
+By default, 10 presets by NovelAI and KoboldAI are included. These were selected out of a sample of 43 presets after applying a K-Means clustering algorithm and selecting the elements closest to the average of each cluster.
+
 ## System requirements
 
 Check the [wiki](https://github.com/oobabooga/text-generation-webui/wiki/System-requirements) for some examples of VRAM and RAM usage in both GPU and CPU mode.
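The selection procedure described in the README addition can be sketched roughly as follows: represent each candidate preset by its numeric sampling parameters, cluster those vectors with K-Means, and keep the preset closest to each cluster centroid ("the elements closest to the average of each cluster"). The snippet below is an illustrative sketch only, not the script actually used for this commit; the preset pool, the feature choice, and the scikit-learn usage are assumptions.

```python
# Rough sketch of the described selection: cluster preset parameters with
# K-Means and keep, per cluster, the preset nearest the centroid.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical pool: name -> (top_p, top_k, temperature, repetition_penalty,
# typical_p). The real input would be the 43 sampled NovelAI/KoboldAI presets.
presets = {
    "NovelAI-Pleasing Results": (1.0, 0, 0.44, 1.15, 1.0),
    "NovelAI-Decadence": (1.0, 100, 2.0, 1.0, 0.97),
    "Kobold-Liminal Drift": (1.0, 0, 0.66, 1.1, 0.6),
    # ... remaining candidate presets ...
}

names = list(presets)
X = np.array([presets[n] for n in names], dtype=float)

kmeans = KMeans(n_clusters=min(10, len(names)), n_init=10, random_state=0).fit(X)

# For each cluster, pick the preset whose parameters lie closest to the centroid.
selected = []
for centroid in kmeans.cluster_centers_:
    distances = np.linalg.norm(X - centroid, axis=1)
    selected.append(names[int(np.argmin(distances))])

print(sorted(set(selected)))
```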
@@ -1,7 +1,7 @@
 do_sample=True,
 max_new_tokens=tokens,
-top_p=1.0,
+top_p=0.5,
 top_k=0,
 temperature=0.7,
 repetition_penalty=1.1,
-typical_p=1.0,
+typical_p=0.19,
presets/Kobold-Liminal Drift.txt (new file)

@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=1.0,
+top_k=0,
+temperature=0.66,
+repetition_penalty=1.1,
+typical_p=0.6,
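For readers unfamiliar with the format shown above, here is a minimal sketch of how such files could be discovered at startup and turned into keyword arguments for a Hugging Face `generate()` call. The function names, the handling of the `tokens` placeholder, and the glob over `presets/*.txt` are assumptions for illustration; the webui ships its own loader.

```python
# Minimal sketch, not the webui's actual loader: discover presets/*.txt and
# parse lines like `top_p=1.0,` into generate() keyword arguments.
from pathlib import Path

def available_presets(directory="presets"):
    """Return preset names found as text files in the presets directory."""
    return sorted(p.stem for p in Path(directory).glob("*.txt"))

def load_preset(path, tokens=200):
    """Parse a preset file into a dict of generate() kwargs (assumed format)."""
    kwargs = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip().rstrip(",")
        if not line or "=" not in line:
            continue
        key, value = line.split("=", 1)
        if value == "tokens":          # placeholder filled in at generation time
            kwargs[key] = tokens
        elif value in ("True", "False"):
            kwargs[key] = value == "True"
        else:
            kwargs[key] = float(value) if "." in value else int(value)
    return kwargs

# Example with the file added above:
# load_preset("presets/Kobold-Liminal Drift.txt", tokens=200)
# -> {'do_sample': True, 'max_new_tokens': 200, 'top_p': 1.0, 'top_k': 0,
#     'temperature': 0.66, 'repetition_penalty': 1.1, 'typical_p': 0.6}
```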
presets/NovelAI-Best Guess.txt (new file)

@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=0.9,
+top_k=100,
+temperature=0.8,
+repetition_penalty=1.15,
+typical_p=1.0,
presets/NovelAI-Decadence.txt (new file)

@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=1.0,
+top_k=100,
+temperature=2,
+repetition_penalty=1,
+typical_p=0.97,
@@ -1,7 +1,7 @@
 do_sample=True,
 max_new_tokens=tokens,
-top_p=1.0,
-top_k=13,
-temperature=1.33,
+top_p=0.98,
+top_k=0,
+temperature=0.63,
 repetition_penalty=1.05,
 typical_p=1.0,
presets/NovelAI-Lycaenidae.txt (new file)

@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=0.85,
+top_k=12,
+temperature=2,
+repetition_penalty=1.15,
+typical_p=1.0,
@@ -2,6 +2,6 @@ do_sample=True,
 max_new_tokens=tokens,
 top_p=1.0,
 top_k=100,
-temperature=1.25,
+temperature=1.07,
 repetition_penalty=1.05,
 typical_p=1.0,
presets/NovelAI-Pleasing Results.txt (new file)

@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=1.0,
+top_k=0,
+temperature=0.44,
+repetition_penalty=1.15,
+typical_p=1.0,
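For context, applying one of these presets ultimately amounts to passing its values as sampling arguments to `generate()`. The sketch below uses the numbers from presets/NovelAI-Pleasing Results.txt above; the model name, prompt, and token budget are placeholders, not part of the commit.

```python
# Hedged example: the values from presets/NovelAI-Pleasing Results.txt passed
# straight to a Hugging Face model. Model and prompt are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

input_ids = tokenizer("The gates of the old city creaked open and", return_tensors="pt").input_ids
output = model.generate(
    input_ids,
    do_sample=True,
    max_new_tokens=200,        # the preset's `tokens` placeholder, chosen by the user
    top_p=1.0,
    top_k=0,
    temperature=0.44,
    repetition_penalty=1.15,
    typical_p=1.0,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```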
@@ -1,7 +1,7 @@
 do_sample=True,
 max_new_tokens=tokens,
-top_p=0.24,
-top_k=85,
-temperature=2.0,
+top_p=0.73,
+top_k=0,
+temperature=0.72,
 repetition_penalty=1.1,
 typical_p=1.0,