jllllll | 9e17325207 | Add CMD_FLAGS.txt functionality to WSL installer (#119) | 2023-08-05 10:26:24 -03:00
SodaPrettyCold | 23055b21ee | [Bug fix] Remove html tags from the Prompt sent to Stable Diffusion (#3151) | 2023-08-04 20:20:28 -03:00
jllllll | 6e30f76ba5 | Bump bitsandbytes to 0.41.1 (#3457) | 2023-08-04 19:28:59 -03:00
oobabooga | 8df3cdfd51 | Add SSL certificate support (#3453) | 2023-08-04 13:57:31 -03:00
oobabooga | ed57a79c6e | Add back silero preview by @missionfloyd (#3446) | 2023-08-04 02:29:14 -03:00
missionfloyd | 2336b75d92 | Remove unnecessary chat.js (#3445) | 2023-08-04 01:58:37 -03:00
oobabooga | 4b3384e353 | Handle unfinished lists during markdown streaming | 2023-08-03 17:15:18 -07:00
Pete | f4005164f4 | Fix llama.cpp truncation (#3400) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-08-03 20:01:15 -03:00
oobabooga | 4e6dc6d99d | Add Contributing guidelines | 2023-08-03 14:40:28 -07:00
matatonic | 8f98268252 | extensions/openai: include content-length for json replies (#3416) | 2023-08-03 16:10:49 -03:00
matatonic | 32e7cbb635 | More models: +StableBeluga2 (#3415) | 2023-08-03 16:02:54 -03:00
Paul DeCarlo | f61573bbde | Add standalone Dockerfile for NVIDIA Jetson (#3336) | 2023-08-03 15:57:33 -03:00
rafa-9 | d578baeb2c | Use character settings from API properties if present (#3428) | 2023-08-03 15:56:40 -03:00
oobabooga | 601fc424cd | Several improvements (#117) | 2023-08-03 14:39:46 -03:00
oobabooga | d93087adc3 | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-08-03 08:14:10 -07:00
oobabooga | 1839dff763 | Use Esc to Stop the generation | 2023-08-03 08:13:17 -07:00
oobabooga | 87dab03dc0 | Add the --cpu option for llama.cpp to prevent CUDA from being used (#3432) | 2023-08-03 11:00:36 -03:00
oobabooga | 3e70bce576 | Properly format exceptions in the UI | 2023-08-03 06:57:21 -07:00
oobabooga | 3390196a14 | Add some javascript alerts for confirmations | 2023-08-02 22:15:20 -07:00
oobabooga | e074538b58 | Revert "Make long_replies ban the eos token as well" (reverts commit 6c521ce967) | 2023-08-02 21:45:10 -07:00
oobabooga | 6bf9e855f8 | Minor change | 2023-08-02 21:41:38 -07:00
oobabooga | 32c564509e | Fix loading session in chat mode | 2023-08-02 21:13:16 -07:00
oobabooga | 4b6c1d3f08 | CSS change | 2023-08-02 20:20:23 -07:00
oobabooga | 0e8f9354b5 | Add direct download for session/chat history JSONs | 2023-08-02 19:43:39 -07:00
jllllll | aca5679968 | Properly fix broken gcc_linux-64 package (#115) | 2023-08-02 23:39:07 -03:00
oobabooga | 32a2bbee4a | Implement auto_max_new_tokens for ExLlama | 2023-08-02 11:03:56 -07:00
oobabooga | e931844fe2 | Add auto_max_new_tokens parameter (#3419) | 2023-08-02 14:52:20 -03:00
oobabooga | 0d9932815c | Improve TheEncrypted777 on mobile devices | 2023-08-02 09:15:54 -07:00
Pete | 6afc1a193b | Add a scrollbar to notebook/default, improve chat scrollbar style (#3403) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-08-02 12:02:36 -03:00
oobabooga | 6c521ce967 | Make long_replies ban the eos token as well | 2023-08-01 18:47:49 -07:00
matatonic | 9ae0eab989 | extensions/openai: +Array input (batched), +Fixes (#3309) | 2023-08-01 22:26:00 -03:00
CrazyShipOne | 40038fdb82 | Add chat instruction config for BaiChuan model (#3332) | 2023-08-01 22:25:20 -03:00
oobabooga | c8a59d79be | Add a template for NewHope | 2023-08-01 13:27:29 -07:00
oobabooga | b53ed70a70 | Make llamacpp_HF 6x faster | 2023-08-01 13:18:20 -07:00
oobabooga | 385229313f | Increase the interface area a bit | 2023-08-01 09:41:57 -07:00
oobabooga | 8d46a8c50a | Change the default chat style and the default preset | 2023-08-01 09:35:17 -07:00
oobabooga | 9773534181 | Update Chat-mode.md | 2023-08-01 08:03:22 -07:00
oobabooga | 959feba602 | When saving model settings, only save the settings for the current loader | 2023-08-01 06:10:09 -07:00
oobabooga | ebb4f22028 | Change a comment | 2023-07-31 20:06:10 -07:00
oobabooga | 8e2217a029 | Minor changes to the Parameters tab | 2023-07-31 19:55:11 -07:00
oobabooga | b2207f123b | Update docs | 2023-07-31 19:20:48 -07:00
oobabooga | f094330df0 | When saving a preset, only save params that differ from the defaults | 2023-07-31 19:13:29 -07:00
oobabooga | 84297d05c4 | Add a "Filter by loader" menu to the Parameters tab | 2023-07-31 19:09:02 -07:00
oobabooga | abea8d9ad3 | Make settings-template.yaml more readable | 2023-07-31 12:01:50 -07:00
oobabooga | 7de7b3d495 | Fix newlines in exported character yamls | 2023-07-31 10:46:02 -07:00
oobabooga | d06c34dea5 | Add an extension that makes chat replies longer (#3363) | 2023-07-31 13:34:41 -03:00
oobabooga | e6be25ea11 | Fix a regression | 2023-07-30 18:12:30 -07:00
oobabooga | 5ca37765d3 | Only replace {{user}} and {{char}} at generation time | 2023-07-30 11:42:30 -07:00
oobabooga | 6e16af34fd | Save uploaded characters as yaml; also allow yaml characters to be uploaded directly | 2023-07-30 11:25:38 -07:00
oobabooga | c25602eb65 | Merge branch 'dev' | 2023-07-30 08:47:50 -07:00