matatonic    | 32e7cbb635 | More models: +StableBeluga2 (#3415) | 2023-08-03 16:02:54 -03:00
CrazyShipOne | 40038fdb82 | add chat instruction config for BaiChuan model (#3332) | 2023-08-01 22:25:20 -03:00
oobabooga    | c8a59d79be | Add a template for NewHope | 2023-08-01 13:27:29 -07:00
oobabooga    | 193c6be39c | Add missing \n to llama-v2 template context | 2023-07-26 08:26:56 -07:00
oobabooga    | 8ec225f245 | Add EOS/BOS tokens to Llama-2 template (following https://github.com/ggerganov/llama.cpp/issues/2262#issuecomment-1641063329) | 2023-07-18 15:35:27 -07:00
oobabooga    | e0631e309f | Create instruction template for Llama-v2 (#3194) | 2023-07-18 17:19:18 -03:00
oobabooga    | 656b457795 | Add Airoboros-v1.2 template | 2023-07-17 07:27:42 -07:00
matatonic    | 68ae5d8262 | more models: +orca_mini (#2859) | 2023-06-25 01:54:53 -03:00
matatonic    | d94ea31d54 | more models: +minotaur 8k (#2806) | 2023-06-21 21:05:08 -03:00
matatonic    | 90be1d9fe1 | More models (match more) & templates (starchat-beta, tulu) (#2790) | 2023-06-21 12:30:44 -03:00
matatonic    | 2220b78e7a | models/config.yaml: +alpacino, +alpasta, +hippogriff, +gpt4all-snoozy, +lazarus, +based, -airoboros 4k (#2580) | 2023-06-17 19:14:25 -03:00
FartyPants   | ac40c59ac3 | Added Guanaco-QLoRA to Instruct character (#2574) | 2023-06-08 12:24:32 -03:00
oobabooga    | f344ccdddb | Add a template for bluemoon | 2023-06-01 14:42:12 -03:00
Carl Kenner  | a9733d4a99 | Metharme context fix (#2153) | 2023-05-19 11:46:13 -03:00
Carl Kenner  | c86231377b | Wizard Mega, Ziya, KoAlpaca, OpenBuddy, Chinese-Vicuna, Vigogne, Bactrian, H2O support, fix Baize (#2159) | 2023-05-19 11:42:41 -03:00
matatonic    | 309b72e549 | [extension/openai] add edits & image endpoints & fix prompt return in non --chat modes (#1935) | 2023-05-11 11:06:39 -03:00
oobabooga    | b7a589afc8 | Improve the Metharme prompt | 2023-05-10 16:09:32 -03:00
oobabooga    | bdf1274b5d | Remove duplicate code | 2023-05-10 01:34:04 -03:00
oobabooga    | ba445cf59f | Fix some galactica templates | 2023-05-09 22:58:59 -03:00
minipasila   | 334486f527 | Added instruct-following template for Metharme (#1679) | 2023-05-09 22:29:22 -03:00
Carl Kenner  | 814f754451 | Support for MPT, INCITE, WizardLM, StableLM, Galactica, Vicuna, Guanaco, and Baize instruction following (#1596) | 2023-05-09 20:37:31 -03:00
Wojtab       | e9e75a9ec7 | Generalize multimodality (llava/minigpt4 7b and 13b now supported) (#1741) | 2023-05-09 20:18:02 -03:00
oobabooga    | 00e333d790 | Add MOSS support | 2023-05-04 23:20:34 -03:00
oobabooga    | 0e6d17304a | Clearer syntax for instruction-following characters | 2023-05-03 22:50:39 -03:00
oobabooga    | 91745f63c3 | Use Vicuna-v0 by default for Vicuna models | 2023-04-26 17:45:38 -03:00
oobabooga    | 93e5c066ae | Update RWKV Raven template | 2023-04-26 17:31:03 -03:00
oobabooga    | 1d8b8222e9 | Revert #1579, apply the proper fix (apparently models dislike trailing spaces) | 2023-04-26 16:47:50 -03:00
TiagoGF      | a941c19337 | Fixing Vicuna text generation (#1579) | 2023-04-26 16:20:27 -03:00
oobabooga    | a777c058af | Precise prompts for instruct mode | 2023-04-26 03:21:53 -03:00
Wojtab       | 12212cf6be | LLaVA support (#1487) | 2023-04-23 20:32:22 -03:00
Forkoz       | c6fe1ced01 | Add ChatGLM support (#1256) (co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-04-16 19:15:03 -03:00
oobabooga    | cb95a2432c | Add Koala support | 2023-04-16 14:41:06 -03:00
OWKenobi     | 310bf46a94 | Instruction Character Vicuna, Instruction Mode Bugfix (#838) | 2023-04-06 17:40:44 -03:00
oobabooga    | e722c240af | Add Instruct mode | 2023-04-05 13:54:50 -03:00