Commit Graph

2473 Commits

Author SHA1 Message Date
oobabooga
0e8f9354b5 Add direct download for session/chat history JSONs 2023-08-02 19:43:39 -07:00
oobabooga
32a2bbee4a Implement auto_max_new_tokens for ExLlama 2023-08-02 11:03:56 -07:00
oobabooga
e931844fe2 Add auto_max_new_tokens parameter (#3419) 2023-08-02 14:52:20 -03:00
oobabooga
0d9932815c Improve TheEncrypted777 on mobile devices 2023-08-02 09:15:54 -07:00
Pete
6afc1a193b Add a scrollbar to notebook/default, improve chat scrollbar style (#3403)
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-08-02 12:02:36 -03:00
oobabooga
6c521ce967 Make long_replies ban the eos token as well 2023-08-01 18:47:49 -07:00
matatonic
9ae0eab989 extensions/openai: +Array input (batched), +Fixes (#3309) 2023-08-01 22:26:00 -03:00
CrazyShipOne
40038fdb82 add chat instruction config for BaiChuan model (#3332) 2023-08-01 22:25:20 -03:00
oobabooga
c8a59d79be Add a template for NewHope 2023-08-01 13:27:29 -07:00
oobabooga
b53ed70a70 Make llamacpp_HF 6x faster 2023-08-01 13:18:20 -07:00
oobabooga
385229313f Increase the interface area a bit 2023-08-01 09:41:57 -07:00
oobabooga
8d46a8c50a Change the default chat style and the default preset 2023-08-01 09:35:17 -07:00
oobabooga
9773534181 Update Chat-mode.md 2023-08-01 08:03:22 -07:00
oobabooga
959feba602 When saving model settings, only save the settings for the current loader 2023-08-01 06:10:09 -07:00
oobabooga
ebb4f22028 Change a comment 2023-07-31 20:06:10 -07:00
oobabooga
8e2217a029 Minor changes to the Parameters tab 2023-07-31 19:55:11 -07:00
oobabooga
b2207f123b Update docs 2023-07-31 19:20:48 -07:00
oobabooga
f094330df0 When saving a preset, only save params that differ from the defaults 2023-07-31 19:13:29 -07:00
oobabooga
84297d05c4 Add a "Filter by loader" menu to the Parameters tab 2023-07-31 19:09:02 -07:00
oobabooga
abea8d9ad3 Make settings-template.yaml more readable 2023-07-31 12:01:50 -07:00
oobabooga
7de7b3d495 Fix newlines in exported character yamls 2023-07-31 10:46:02 -07:00
oobabooga
d06c34dea5 Add an extension that makes chat replies longer (#3363) 2023-07-31 13:34:41 -03:00
oobabooga
e6be25ea11 Fix a regression 2023-07-30 18:12:30 -07:00
oobabooga
5ca37765d3 Only replace {{user}} and {{char}} at generation time 2023-07-30 11:42:30 -07:00
oobabooga
6e16af34fd Save uploaded characters as yaml
Also allow yaml characters to be uploaded directly
2023-07-30 11:25:38 -07:00
oobabooga
c25602eb65 Merge branch 'dev' 2023-07-30 08:47:50 -07:00
oobabooga
ca4188aabc Update the example extension 2023-07-29 18:57:22 -07:00
jllllll
c4e14a757c Bump exllama module to 0.0.9 (#3338) 2023-07-29 22:16:23 -03:00
GuizzyQC
4b37a2b397 sd_api_pictures: Widen sliders for image size minimum and maximum (#3326) 2023-07-26 13:49:46 -03:00
oobabooga
d6314fd539 Change a comment 2023-07-26 09:38:45 -07:00
oobabooga
f24f87cfb0 Change a comment 2023-07-26 09:38:13 -07:00
oobabooga
de5de045e0 Set rms_norm_eps to 5e-6 for every llama-2 ggml model, not just 70b 2023-07-26 08:26:56 -07:00
oobabooga
193c6be39c Add missing \n to llama-v2 template context 2023-07-26 08:26:56 -07:00
oobabooga
ec68d5211e Set rms_norm_eps to 5e-6 for every llama-2 ggml model, not just 70b 2023-07-26 08:23:24 -07:00
oobabooga
a9e10753df Add missing \n to llama-v2 template context 2023-07-26 07:59:49 -07:00
oobabooga
b780d520d2 Add a link to the gradio docs 2023-07-26 07:49:42 -07:00
oobabooga
b553c33dd0 Add a link to the gradio docs 2023-07-26 07:49:22 -07:00
oobabooga
d94ba6e68b Define visible_text before applying chat_input extensions 2023-07-26 07:30:25 -07:00
oobabooga
b31321c779 Define visible_text before applying chat_input extensions 2023-07-26 07:27:14 -07:00
oobabooga
b17893a58f Revert "Add tensor split support for llama.cpp (#3171)"
This reverts commit 031fe7225e.
2023-07-26 07:06:01 -07:00
oobabooga
517d40cffe Update Extensions.md 2023-07-26 07:01:35 -07:00
oobabooga
b11f63cb18 update extensions docs 2023-07-26 07:00:33 -07:00
oobabooga
4a24849715 Revert changes 2023-07-25 21:09:32 -07:00
oobabooga
69f8b35bc9 Revert changes to README 2023-07-25 20:51:19 -07:00
oobabooga
ed80a2e7db Reorder llama.cpp params 2023-07-25 20:45:20 -07:00
oobabooga
0e8782df03 Set instruction template when switching from default/notebook to chat 2023-07-25 20:37:01 -07:00
oobabooga
28779cd959 Use dark theme by default 2023-07-25 20:11:57 -07:00
oobabooga
c2e0d46616 Add credits 2023-07-25 15:49:04 -07:00
oobabooga
1b89c304ad Update README 2023-07-25 15:46:12 -07:00
oobabooga
d3abe7caa8 Update llama.cpp.md 2023-07-25 15:33:16 -07:00