Author | Commit | Message | Date
Honkware | 0a6a498383 | Load xgen tokenizer | 2023-06-29 01:32:44 -05:00
oobabooga | c6cae106e7 | Bump llama-cpp-python | 2023-06-28 18:14:45 -03:00
oobabooga | 20740ab16e | Revert "Fix exllama_hf gibbersh above 2048 context, and works >5000 context. (#2913)" | 2023-06-28 18:10:34 -03:00
    This reverts commit 37a16d23a7.
jllllll | 7b048dcf67 | Bump exllama module version to 0.0.4 (#2915) | 2023-06-28 18:09:58 -03:00
Panchovix | 37a16d23a7 | Fix exllama_hf gibbersh above 2048 context, and works >5000 context. (#2913) | 2023-06-28 12:36:07 -03:00
oobabooga | 63770c0643 | Update docs/Extensions.md | 2023-06-27 22:25:05 -03:00
matatonic | da0ea9e0f3 | set +landmark, +superhot-8k to 8k length (#2903) | 2023-06-27 22:05:52 -03:00
missionfloyd | 5008daa0ff | Add exception handler to load_checkpoint() (#2904) | 2023-06-27 22:00:29 -03:00
oobabooga | c95009d2bd | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-06-27 18:48:17 -03:00
oobabooga | 67a83f3ad9 | Use DPM++ 2M Karras for Stable Diffusion | 2023-06-27 18:47:35 -03:00
FartyPants | ab1998146b | Training update - backup the existing adapter before training on top of it (#2902) | 2023-06-27 18:24:04 -03:00
Minecrafter20 | 40bbd53640 | Add custom prompt format for SD API pictures (#1964) | 2023-06-27 17:49:18 -03:00
missionfloyd | cb029cf65f | Get SD samplers from API (#2889) | 2023-06-27 17:31:54 -03:00
GuizzyQC | d7a7f7896b | Add SD checkpoint selection in sd_api_pictures (#2872) | 2023-06-27 17:29:27 -03:00
oobabooga | 7611978f7b | Add Community section to README | 2023-06-27 13:56:14 -03:00
oobabooga | 22d455b072 | Add LoRA support to ExLlama_HF | 2023-06-26 00:10:33 -03:00
oobabooga | b7c627f9a0 | Set UI defaults | 2023-06-25 22:55:43 -03:00
oobabooga | c52290de50 | ExLlama with long context (#2875) | 2023-06-25 22:49:26 -03:00
oobabooga | 9290c6236f | Keep ExLlama_HF if already selected | 2023-06-25 19:06:28 -03:00
oobabooga | 75fd763f99 | Fix chat saving issue (closes #2863) | 2023-06-25 18:14:57 -03:00
FartyPants | 21c189112c | Several Training Enhancements (#2868) | 2023-06-25 15:34:46 -03:00
oobabooga | 95212edf1f | Update training.py | 2023-06-25 12:13:15 -03:00
oobabooga | 1f5ea451c9 | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-06-25 02:14:19 -03:00
oobabooga | f31281a8de | Fix loading instruction templates containing literal '\n' | 2023-06-25 02:13:26 -03:00
matatonic | 68ae5d8262 | more models: +orca_mini (#2859) | 2023-06-25 01:54:53 -03:00
oobabooga | f0fcd1f697 | Sort some imports | 2023-06-25 01:44:36 -03:00
oobabooga | 365b672531 | Minor change to prevent future bugs | 2023-06-25 01:38:54 -03:00
oobabooga | e6e5f546b8 | Reorganize Chat settings tab | 2023-06-25 01:10:20 -03:00
matatonic | b45baeea41 | extensions/openai: Major docs update, fix #2852 (critical bug), minor improvements (#2849) | 2023-06-24 22:50:04 -03:00
oobabooga | ebfcfa41f2 | Update ExLlama.md | 2023-06-24 20:25:34 -03:00
jllllll | bef67af23c | Use pre-compiled python module for ExLlama (#2770) | 2023-06-24 20:24:17 -03:00
oobabooga | a70a2ac3be | Update ExLlama.md | 2023-06-24 20:23:01 -03:00
oobabooga | b071eb0d4b | Clean up the presets (#2854) | 2023-06-24 18:41:17 -03:00
oobabooga | cec5fb0ef6 | Failed attempt at evaluating exllama_hf perplexity | 2023-06-24 12:02:25 -03:00
快乐的我531 | e356f69b36 | Make stop_everything work with non-streamed generation (#2848) | 2023-06-24 11:19:16 -03:00
oobabooga | ec482f3dae | Apply input extensions after yielding *Is typing...* | 2023-06-24 11:07:11 -03:00
oobabooga | 3e80f2aceb | Apply the output extensions only once | 2023-06-24 10:59:07 -03:00
    Relevant for google translate, silero
rizerphe | 77baf43f6d | Add CORS support to the API (#2718) | 2023-06-24 10:16:06 -03:00
matatonic | 8c36c19218 | 8k size only for minotaur-15B (#2815) | 2023-06-24 10:14:19 -03:00
    Co-authored-by: Matthew Ashton <mashton-gitlab@zhero.org>
Roman | 38897fbd8a | fix: added model parameter check (#2829) | 2023-06-24 10:09:34 -03:00
missionfloyd | 51a388fa34 | Organize chat history/character import menu (#2845) | 2023-06-24 09:55:02 -03:00
    * Organize character import menu
    * Move Chat history upload/download labels
oobabooga | 8bb3bb39b3 | Implement stopping string search in string space (#2847) | 2023-06-24 09:43:00 -03:00
oobabooga | 0f9088f730 | Update README | 2023-06-23 12:24:43 -03:00
oobabooga | 3ae9af01aa | Add --no_use_cuda_fp16 param for AutoGPTQ | 2023-06-23 12:22:56 -03:00
Panchovix | 5646690769 | Fix some models not loading on exllama_hf (#2835) | 2023-06-23 11:31:02 -03:00
oobabooga | 383c50f05b | Replace old presets with the results of Preset Arena (#2830) | 2023-06-23 01:48:29 -03:00
missionfloyd | aa1f1ef46a | Fix printing, take two. (#2810) | 2023-06-22 16:06:49 -03:00
    * Format chat for printing
    * Better printing
Panchovix | b4a38c24b7 | Fix Multi-GPU not working on exllama_hf (#2803) | 2023-06-22 16:05:25 -03:00
matatonic | d94ea31d54 | more models. +minotaur 8k (#2806) | 2023-06-21 21:05:08 -03:00
LarryVRH | 580c1ee748 | Implement a demo HF wrapper for exllama to utilize existing HF transformers decoding. (#2777) | 2023-06-21 15:31:42 -03:00