oobabooga | 36864cb3e8 | Use Alpaca as the default instruction template | 2023-08-29 13:06:25 -07:00
oobabooga | 73d9befb65 | Make "Show controls" customizable through settings.yaml | 2023-08-16 07:04:18 -07:00
oobabooga | 619cb4e78b | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00
oobabooga | a1a9ec895d | Unify the 3 interface modes (#3554) | 2023-08-13 01:12:15 -03:00
oobabooga | 0af10ab49b | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00
oobabooga | e931844fe2 | Add auto_max_new_tokens parameter (#3419) | 2023-08-02 14:52:20 -03:00
oobabooga | 8d46a8c50a | Change the default chat style and the default preset | 2023-08-01 09:35:17 -07:00
oobabooga | abea8d9ad3 | Make settings-template.yaml more readable | 2023-07-31 12:01:50 -07:00
oobabooga | 28779cd959 | Use dark theme by default | 2023-07-25 20:11:57 -07:00
oobabooga | 913e060348 | Change the default preset to Divine Intellect | 2023-07-19 08:24:37 -07:00
    It seems to reduce hallucination while using instruction-tuned models.
oobabooga | 8c1c2e0fae | Increase max_new_tokens upper limit | 2023-07-17 17:08:22 -07:00
oobabooga | b1a6ea68dd | Disable "autoload the model" by default | 2023-07-17 07:40:56 -07:00
oobabooga | c52290de50 | ExLlama with long context (#2875) | 2023-06-25 22:49:26 -03:00
oobabooga | 383c50f05b | Replace old presets with the results of Preset Arena (#2830) | 2023-06-23 01:48:29 -03:00
oobabooga | 3a5cfe96f0 | Increase chat_prompt_size_max | 2023-06-05 17:37:37 -03:00
oobabooga | 19f78684e6 | Add "Start reply with" feature to chat mode | 2023-06-02 13:58:08 -03:00
oobabooga | 00ebea0b2a | Use YAML for presets and settings | 2023-05-28 22:34:12 -03:00
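
Several of the commits above revolve around settings.yaml overrides (the dark theme default, "Show controls", disabling model autoload). The following is a minimal sketch, not taken from the repository, of how such a file might be loaded and merged over built-in defaults; the key names and the load_settings helper are illustrative assumptions, not verified project settings or APIs.

```python
# Hypothetical sketch: merge user overrides from settings.yaml over defaults.
# Key names below are assumptions chosen to mirror the commit messages above.
from pathlib import Path

import yaml  # requires PyYAML

DEFAULTS = {
    "dark_theme": True,        # "Use dark theme by default"
    "show_controls": True,     # "Make 'Show controls' customizable through settings.yaml"
    "autoload_model": False,   # "Disable 'autoload the model' by default"
}

def load_settings(path: str = "settings.yaml") -> dict:
    """Return the defaults, overridden by any keys present in settings.yaml."""
    settings = dict(DEFAULTS)
    p = Path(path)
    if p.exists():
        user_settings = yaml.safe_load(p.read_text(encoding="utf-8")) or {}
        settings.update(user_settings)
    return settings

if __name__ == "__main__":
    print(load_settings())
```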