736fe4aa3e | oobabooga | 2023-12-12 12:27:40 -08:00 | Fix server refusing to close on Ctrl+C
39d2fe1ed9 | oobabooga | 2023-12-12 17:23:14 -03:00 | Jinja templates for Instruct and Chat (#4874)
c21a9668a5 | oobabooga | 2023-12-04 21:17:05 -08:00 | Lint
f786aa3caa | erew123 | 2023-12-05 02:16:16 -03:00 | Clean-up Ctrl+C Shutdown (#4802)
3f993280e4 | oobabooga | 2023-12-04 07:27:44 -08:00 | Minor changes
0bfd5090be | Song Fuchang | 2023-12-03 22:51:18 -03:00 | Import accelerate very early to make Intel GPU happy (#4704)
ef6feedeb2 | oobabooga | 2023-11-18 23:38:39 -03:00 | Add --nowebui flag for pure API mode (#4651)
4aabff3728 | oobabooga | 2023-11-10 06:39:08 -08:00 | Remove old API, launch OpenAI API with --api
2358706453 | oobabooga | 2023-11-07 20:58:06 -08:00 | Add /v1/internal/model/load endpoint (tentative)
fae8062d39 | oobabooga | 2023-10-10 22:20:49 -03:00 | Bump to latest gradio (3.47) (#4258)
b973b91d73 | oobabooga | 2023-09-25 10:28:35 -07:00 | Automatically filter by loader (closes #4072)
8ab3eca9ec | oobabooga | 2023-09-22 09:35:19 -07:00 | Add a warning for outdated installations
9b7646140c | oobabooga | 2023-09-19 13:51:57 -07:00 | Trim model path if using absolute path
df123a20fc | oobabooga | 2023-09-11 20:13:10 -07:00 | Prevent extra keys from being saved to settings.yaml
9331ab4798 | oobabooga | 2023-09-11 18:49:30 -03:00 | Read GGUF metadata (#3873)
ed86878f02 | oobabooga | 2023-09-11 07:44:00 -07:00 | Remove GGML support
4affa08821 | oobabooga | 2023-09-02 11:31:33 -07:00 | Do not impose instruct mode while loading models
5c7d8bfdfd | oobabooga | 2023-08-25 07:06:57 -07:00 | Detect CodeLlama settings
73d9befb65 | oobabooga | 2023-08-16 07:04:18 -07:00 | Make "Show controls" customizable through settings.yaml
619cb4e78b | oobabooga | 2023-08-14 11:46:07 -03:00 | Add "save defaults to settings.yaml" button (#3574)
a1a9ec895d | oobabooga | 2023-08-13 01:12:15 -03:00 | Unify the 3 interface modes (#3554)
6d354bb50b | oobabooga | 2023-08-07 23:57:25 -03:00 | Allow the webui to do multiple tasks simultaneously
bbe4a29a25 | oobabooga | 2023-08-07 23:03:09 -03:00 | Add back dark theme code
65aa11890f | oobabooga | 2023-08-06 21:49:27 -03:00 | Refactor everything (#3481)
0af10ab49b | oobabooga | 2023-08-06 17:22:48 -03:00 | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325)
8df3cdfd51 | oobabooga | 2023-08-04 13:57:31 -03:00 | Add SSL certificate support (#3453)
2336b75d92 | missionfloyd | 2023-08-04 01:58:37 -03:00 | Remove unnecessary chat.js (#3445)
1839dff763 | oobabooga | 2023-08-03 08:13:17 -07:00 | Use Esc to stop the generation
3e70bce576 | oobabooga | 2023-08-03 06:57:21 -07:00 | Properly format exceptions in the UI
3390196a14 | oobabooga | 2023-08-02 22:15:20 -07:00 | Add some javascript alerts for confirmations
6bf9e855f8 | oobabooga | 2023-08-02 21:41:38 -07:00 | Minor change
32c564509e | oobabooga | 2023-08-02 21:13:16 -07:00 | Fix loading session in chat mode
4b6c1d3f08 | oobabooga | 2023-08-02 20:20:23 -07:00 | CSS change
0e8f9354b5 | oobabooga | 2023-08-02 19:43:39 -07:00 | Add direct download for session/chat history JSONs
e931844fe2 | oobabooga | 2023-08-02 14:52:20 -03:00 | Add auto_max_new_tokens parameter (#3419)
0d9932815c | oobabooga | 2023-08-02 09:15:54 -07:00 | Improve TheEncrypted777 on mobile devices
6afc1a193b | Pete | 2023-08-02 12:02:36 -03:00 | Add a scrollbar to notebook/default, improve chat scrollbar style (#3403) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>)
b53ed70a70 | oobabooga | 2023-08-01 13:18:20 -07:00 | Make llamacpp_HF 6x faster
959feba602 | oobabooga | 2023-08-01 06:10:09 -07:00 | When saving model settings, only save the settings for the current loader
ebb4f22028 | oobabooga | 2023-07-31 20:06:10 -07:00 | Change a comment
8e2217a029 | oobabooga | 2023-07-31 19:55:11 -07:00 | Minor changes to the Parameters tab
b2207f123b | oobabooga | 2023-07-31 19:20:48 -07:00 | Update docs
84297d05c4 | oobabooga | 2023-07-31 19:09:02 -07:00 | Add a "Filter by loader" menu to the Parameters tab
e6be25ea11 | oobabooga | 2023-07-30 18:12:30 -07:00 | Fix a regression
5ca37765d3 | oobabooga | 2023-07-30 11:42:30 -07:00 | Only replace {{user}} and {{char}} at generation time
6e16af34fd | oobabooga | 2023-07-30 11:25:38 -07:00 | Save uploaded characters as yaml; also allow yaml characters to be uploaded directly
ed80a2e7db | oobabooga | 2023-07-25 20:45:20 -07:00 | Reorder llama.cpp params
0e8782df03 | oobabooga | 2023-07-25 20:37:01 -07:00 | Set instruction template when switching from default/notebook to chat
1b89c304ad | oobabooga | 2023-07-25 15:46:12 -07:00 | Update README
75c2dd38cf | oobabooga | 2023-07-25 15:15:29 -07:00 | Remove flexgen support