Commit Graph

2461 Commits

Author          SHA1        Message  Date
oobabooga       6d354bb50b  Allow the webui to do multiple tasks simultaneously  2023-08-07 23:57:25 -03:00
oobabooga       584dd33424  Fix missing example_dialogue when uploading characters  2023-08-07 23:44:59 -03:00
oobabooga       bbe4a29a25  Add back dark theme code  2023-08-07 23:03:09 -03:00
oobabooga       2d0634cd07  Bump transformers commit for positive prompts  2023-08-07 08:57:19 -07:00
Sam             3b27404865  Make dockerfile respect specified cuda version (#3474)  2023-08-07 10:19:16 -03:00
oobabooga       412f6ff9d3  Change alpha_value maximum and step  2023-08-07 06:08:51 -07:00
oobabooga       a373c96d59  Fix a bug in modules/shared.py  2023-08-06 20:36:35 -07:00
jllllll         2cf64474f2  Use chat_instruct_command in API (#3482)  2023-08-06 23:46:25 -03:00
oobabooga       3d48933f27  Remove ancient deprecation warnings  2023-08-06 18:58:59 -07:00
oobabooga       c237ce607e  Move characters/instruction-following to instruction-templates  2023-08-06 17:50:32 -07:00
oobabooga       65aa11890f  Refactor everything (#3481)  2023-08-06 21:49:27 -03:00
oobabooga       d4b851bdc8  Credit turboderp  2023-08-06 13:43:15 -07:00
oobabooga       0af10ab49b  Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325)  2023-08-06 17:22:48 -03:00
missionfloyd    5134878344  Fix chat message order (#3461)  2023-08-05 13:53:54 -03:00
jllllll         44f31731af  Create logs dir if missing when saving history (#3462)  2023-08-05 13:47:16 -03:00
jllllll         5ee95d126c  Bump exllama wheels to 0.0.10 (#3467)  2023-08-05 13:46:14 -03:00
Forkoz          9dcb37e8d4  Fix: Mirostat fails on models split across multiple GPUs  2023-08-05 13:45:47 -03:00
SodaPrettyCold  23055b21ee  [Bug fix] Remove html tags from the Prompt sent to Stable Diffusion (#3151)  2023-08-04 20:20:28 -03:00
jllllll         6e30f76ba5  Bump bitsandbytes to 0.41.1 (#3457)  2023-08-04 19:28:59 -03:00
oobabooga       8df3cdfd51  Add SSL certificate support (#3453)  2023-08-04 13:57:31 -03:00
oobabooga       ed57a79c6e  Add back silero preview by @missionfloyd (#3446)  2023-08-04 02:29:14 -03:00
missionfloyd    2336b75d92  Remove unnecessary chat.js (#3445)  2023-08-04 01:58:37 -03:00
oobabooga       4b3384e353  Handle unfinished lists during markdown streaming  2023-08-03 17:15:18 -07:00
Pete            f4005164f4  Fix llama.cpp truncation (#3400)  2023-08-03 20:01:15 -03:00
                            Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga       4e6dc6d99d  Add Contributing guidelines  2023-08-03 14:40:28 -07:00
matatonic       8f98268252  extensions/openai: include content-length for json replies (#3416)  2023-08-03 16:10:49 -03:00
matatonic       32e7cbb635  More models: +StableBeluga2 (#3415)  2023-08-03 16:02:54 -03:00
Paul DeCarlo    f61573bbde  Add standalone Dockerfile for NVIDIA Jetson (#3336)  2023-08-03 15:57:33 -03:00
rafa-9          d578baeb2c  Use character settings from API properties if present (#3428)  2023-08-03 15:56:40 -03:00
oobabooga       d93087adc3  Merge remote-tracking branch 'refs/remotes/origin/main'  2023-08-03 08:14:10 -07:00
oobabooga       1839dff763  Use Esc to stop the generation  2023-08-03 08:13:17 -07:00
oobabooga       87dab03dc0  Add the --cpu option for llama.cpp to prevent CUDA from being used (#3432)  2023-08-03 11:00:36 -03:00
oobabooga       3e70bce576  Properly format exceptions in the UI  2023-08-03 06:57:21 -07:00
oobabooga       3390196a14  Add some javascript alerts for confirmations  2023-08-02 22:15:20 -07:00
oobabooga       e074538b58  Revert "Make long_replies ban the eos token as well"  2023-08-02 21:45:10 -07:00
                            This reverts commit 6c521ce967.
oobabooga       6bf9e855f8  Minor change  2023-08-02 21:41:38 -07:00
oobabooga       32c564509e  Fix loading session in chat mode  2023-08-02 21:13:16 -07:00
oobabooga       4b6c1d3f08  CSS change  2023-08-02 20:20:23 -07:00
oobabooga       0e8f9354b5  Add direct download for session/chat history JSONs  2023-08-02 19:43:39 -07:00
oobabooga       32a2bbee4a  Implement auto_max_new_tokens for ExLlama  2023-08-02 11:03:56 -07:00
oobabooga       e931844fe2  Add auto_max_new_tokens parameter (#3419)  2023-08-02 14:52:20 -03:00
oobabooga       0d9932815c  Improve TheEncrypted777 on mobile devices  2023-08-02 09:15:54 -07:00
Pete            6afc1a193b  Add a scrollbar to notebook/default, improve chat scrollbar style (#3403)  2023-08-02 12:02:36 -03:00
                            Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga       6c521ce967  Make long_replies ban the eos token as well  2023-08-01 18:47:49 -07:00
matatonic       9ae0eab989  extensions/openai: +Array input (batched), +Fixes (#3309)  2023-08-01 22:26:00 -03:00
CrazyShipOne    40038fdb82  add chat instruction config for BaiChuan model (#3332)  2023-08-01 22:25:20 -03:00
oobabooga       c8a59d79be  Add a template for NewHope  2023-08-01 13:27:29 -07:00
oobabooga       b53ed70a70  Make llamacpp_HF 6x faster  2023-08-01 13:18:20 -07:00
oobabooga       385229313f  Increase the interface area a bit  2023-08-01 09:41:57 -07:00
oobabooga       8d46a8c50a  Change the default chat style and the default preset  2023-08-01 09:35:17 -07:00