New NovelAI/KoboldAI preset selection

oobabooga 2023-01-23 20:44:27 -03:00
parent ce4756fb88
commit 6c40f7eeb4
10 changed files with 46 additions and 9 deletions


@@ -140,6 +140,8 @@ Out of memory errors? [Check this guide](https://github.com/oobabooga/text-gener
 Inference settings presets can be created under `presets/` as text files. These files are detected automatically at startup.
 
+By default, 10 presets by NovelAI and KoboldAI are included. These were selected out of a sample of 43 presets after applying a K-Means clustering algorithm and selecting the elements closest to the average of each cluster.
+
 ## System requirements
 Check the [wiki](https://github.com/oobabooga/text-generation-webui/wiki/System-requirements) for some examples of VRAM and RAM usage in both GPU and CPU mode.
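The selection procedure described in the README addition (cluster 43 presets, keep the element nearest each cluster centroid) can be sketched in plain Python. This is an illustrative reconstruction, not the script actually used for the commit; `kmeans_select` and `_dist` are hypothetical helpers, and a real run would more likely use a library such as scikit-learn.

```python
import math
import random

def _dist(a, b):
    """Euclidean distance between two parameter vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans_select(presets, k, iters=50, seed=0):
    """Cluster preset parameter vectors with a basic k-means and return,
    for each cluster, the name of the preset closest to that cluster's
    centroid -- i.e. the "element closest to the average of each cluster"."""
    rng = random.Random(seed)
    names = list(presets)
    vecs = [presets[n] for n in names]
    # initialize centroids from k randomly chosen presets
    centroids = [list(v) for v in rng.sample(vecs, k)]
    for _ in range(iters):
        # assign every vector to its nearest centroid
        labels = [min(range(k), key=lambda c: _dist(v, centroids[c]))
                  for v in vecs]
        # recompute each centroid as the mean of its members
        for c in range(k):
            members = [v for v, lab in zip(vecs, labels) if lab == c]
            if members:
                centroids[c] = [sum(col) / len(col) for col in zip(*members)]
    # keep the real preset nearest to each final centroid
    return [names[min(range(len(vecs)), key=lambda i: _dist(vecs[i], centroids[c]))]
            for c in range(k)]
```

Here each preset would be represented as a vector of its numeric settings, e.g. `(top_p, top_k, temperature, repetition_penalty, typical_p)`, and `kmeans_select(presets, 10)` would pick the 10 representatives.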


@@ -1,7 +1,7 @@
 do_sample=True,
 max_new_tokens=tokens,
-top_p=1.0,
+top_p=0.5,
 top_k=0,
 temperature=0.7,
 repetition_penalty=1.1,
-typical_p=1.0,
+typical_p=0.19,
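A preset file like the one in this hunk is just a list of `key=value,` lines. A minimal loader sketch follows; `load_preset` is a hypothetical helper for illustration, not the web UI's actual loading code, and it leaves non-literal values such as `tokens` as strings since they are substituted at generation time.

```python
import ast

def load_preset(path):
    """Parse a preset text file of `key=value,` lines into a dict of
    generation kwargs. (Illustrative sketch; the web UI's real loader
    may differ.)"""
    params = {}
    with open(path) as f:
        for line in f:
            line = line.strip().rstrip(",")
            if not line:
                continue
            key, _, value = line.partition("=")
            try:
                # booleans and numbers parse to real Python values
                params[key] = ast.literal_eval(value)
            except (ValueError, SyntaxError):
                # non-literals (e.g. `tokens`) stay as placeholder strings
                params[key] = value
    return params
```

The resulting dict maps directly onto sampling keyword arguments such as `top_p`, `typical_p`, and `repetition_penalty`.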


@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=1.0,
+top_k=0,
+temperature=0.66,
+repetition_penalty=1.1,
+typical_p=0.6,


@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=0.9,
+top_k=100,
+temperature=0.8,
+repetition_penalty=1.15,
+typical_p=1.0,


@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=1.0,
+top_k=100,
+temperature=2,
+repetition_penalty=1,
+typical_p=0.97,


@@ -1,7 +1,7 @@
 do_sample=True,
 max_new_tokens=tokens,
-top_p=1.0,
-top_k=13,
-temperature=1.33,
+top_p=0.98,
+top_k=0,
+temperature=0.63,
 repetition_penalty=1.05,
 typical_p=1.0,


@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=0.85,
+top_k=12,
+temperature=2,
+repetition_penalty=1.15,
+typical_p=1.0,


@@ -2,6 +2,6 @@ do_sample=True,
 max_new_tokens=tokens,
 top_p=1.0,
 top_k=100,
-temperature=1.25,
+temperature=1.07,
 repetition_penalty=1.05,
 typical_p=1.0,


@@ -0,0 +1,7 @@
+do_sample=True,
+max_new_tokens=tokens,
+top_p=1.0,
+top_k=0,
+temperature=0.44,
+repetition_penalty=1.15,
+typical_p=1.0,


@@ -1,7 +1,7 @@
 do_sample=True,
 max_new_tokens=tokens,
-top_p=0.24,
-top_k=85,
-temperature=2.0,
+top_p=0.73,
+top_k=0,
+temperature=0.72,
 repetition_penalty=1.1,
 typical_p=1.0,