Commit Graph

581 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Gabriel Pena | eedb3bf023 | Add low vram mode on llama cpp (#3076) | 2023-07-12 11:05:13 -03:00 |
| Axiom Wolf | d986c17c52 | Chat history download creates more detailed file names (#3051) | 2023-07-12 00:10:36 -03:00 |
| Salvador E. Tropea | 324e45b848 | [Fixed] wbits and groupsize values from model not shown (#2977) | 2023-07-11 23:27:38 -03:00 |
| oobabooga | bfafd07f44 | Change a message | 2023-07-11 18:29:20 -07:00 |
| micsthepick | 3708de2b1f | respect model dir for downloads (#3077) (#3079) | 2023-07-11 18:55:46 -03:00 |
| oobabooga | 9aee1064a3 | Block a cloudfare request | 2023-07-06 22:24:52 -07:00 |
| oobabooga | 40c5722499 | Fix #2998 | 2023-07-04 11:35:25 -03:00 |
| oobabooga | 55457549cd | Add information about presets to the UI | 2023-07-03 22:39:01 -07:00 |
| Panchovix | 10c8c197bf | Add Support for Static NTK RoPE scaling for exllama/exllama_hf (#2955) | 2023-07-04 01:13:16 -03:00 |
| FartyPants | eb6112d5a2 | Update server.py - clear LORA after reload (#2952) | 2023-07-04 00:13:38 -03:00 |
| oobabooga | 4b1804a438 | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| missionfloyd | ac0f96e785 | Some more character import tweaks. (#2921) | 2023-06-29 14:56:25 -03:00 |
| oobabooga | 5d2a8b31be | Improve Parameters tab UI | 2023-06-29 14:33:47 -03:00 |
| oobabooga | 3443219cbc | Add repetition penalty range parameter to transformers (#2916) | 2023-06-29 13:40:13 -03:00 |
| oobabooga | 22d455b072 | Add LoRA support to ExLlama_HF | 2023-06-26 00:10:33 -03:00 |
| oobabooga | b7c627f9a0 | Set UI defaults | 2023-06-25 22:55:43 -03:00 |
| oobabooga | c52290de50 | ExLlama with long context (#2875) | 2023-06-25 22:49:26 -03:00 |
| oobabooga | f0fcd1f697 | Sort some imports | 2023-06-25 01:44:36 -03:00 |
| oobabooga | e6e5f546b8 | Reorganize Chat settings tab | 2023-06-25 01:10:20 -03:00 |
| jllllll | bef67af23c | Use pre-compiled python module for ExLlama (#2770) | 2023-06-24 20:24:17 -03:00 |
| missionfloyd | 51a388fa34 | Organize chat history/character import menu (#2845)<br>• Organize character import menu<br>• Move Chat history upload/download labels | 2023-06-24 09:55:02 -03:00 |
| oobabooga | 3ae9af01aa | Add --no_use_cuda_fp16 param for AutoGPTQ | 2023-06-23 12:22:56 -03:00 |
| LarryVRH | 580c1ee748 | Implement a demo HF wrapper for exllama to utilize existing HF transformers decoding. (#2777) | 2023-06-21 15:31:42 -03:00 |
| Morgan Schweers | 447569e31a | Add a download progress bar to the web UI. (#2472)<br>• Show download progress on the model screen.<br>• In case of error, mark as done to clear progress bar.<br>• Increase the iteration block size to reduce overhead. | 2023-06-20 22:59:14 -03:00 |
| oobabooga | 09c781b16f | Add modules/block_requests.py<br>This has become unnecessary, but it could be useful in the future for other libraries. | 2023-06-18 16:31:14 -03:00 |
| oobabooga | 44f28830d1 | Chat CSS: fix ul, li, pre styles + remove redefinitions | 2023-06-18 15:20:51 -03:00 |
| oobabooga | 239b11c94b | Minor bug fixes | 2023-06-17 17:57:56 -03:00 |
| oobabooga | 1e400218e9 | Fix a typo | 2023-06-16 21:01:57 -03:00 |
| oobabooga | 5f392122fd | Add gpu_split param to ExLlama<br>Adapted from code created by Ph0rk0z. Thank you Ph0rk0z. | 2023-06-16 20:49:36 -03:00 |
| oobabooga | 83be8eacf0 | Minor fix | 2023-06-16 20:38:32 -03:00 |
| oobabooga | 9f40032d32 | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00 |
| oobabooga | dea43685b0 | Add some clarifications | 2023-06-16 19:10:53 -03:00 |
| oobabooga | 7ef6a50e84 | Reorganize model loading UI completely (#2720) | 2023-06-16 19:00:37 -03:00 |
| Tom Jobbins | 646b0c889f | AutoGPTQ: Add UI and command line support for disabling fused attention and fused MLP (#2648) | 2023-06-15 23:59:54 -03:00 |
| oobabooga | 474dc7355a | Allow API requests to use parameter presets | 2023-06-14 11:32:20 -03:00 |
| FartyPants | 9f150aedc3 | A small UI change in Models menu (#2640) | 2023-06-12 01:24:44 -03:00 |
| oobabooga | da5d9a28d8 | Fix tabbed extensions showing up at the bottom of the UI | 2023-06-11 21:20:51 -03:00 |
| oobabooga | ae5e2b3470 | Reorganize a bit | 2023-06-11 19:50:20 -03:00 |
| oobabooga | f4defde752 | Add a menu for installing extensions | 2023-06-11 17:11:06 -03:00 |
| oobabooga | 8e73806b20 | Improve "Interface mode" appearance | 2023-06-11 15:29:45 -03:00 |
| oobabooga | ac122832f7 | Make dropdown menus more similar to automatic1111 | 2023-06-11 14:20:16 -03:00 |
| oobabooga | 6133675e0f | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00 |
| brandonj60 | b04e18d10c | Add Mirostat v2 sampling to transformer models (#2571) | 2023-06-09 21:26:31 -03:00 |
| oobabooga | eb2601a8c3 | Reorganize Parameters tab | 2023-06-06 14:51:02 -03:00 |
| oobabooga | f06a1387f0 | Reorganize Models tab | 2023-06-06 07:58:07 -03:00 |
| oobabooga | d49d299b67 | Change a message | 2023-06-06 07:54:56 -03:00 |
| oobabooga | 7ed1e35fbf | Reorganize Parameters tab in chat mode | 2023-06-06 07:46:25 -03:00 |
| oobabooga | 00b94847da | Remove softprompt support | 2023-06-06 07:42:23 -03:00 |
| oobabooga | f276d88546 | Use AutoGPTQ by default for GPTQ models | 2023-06-05 15:41:48 -03:00 |
| oobabooga | 6a75bda419 | Assign some 4096 seq lengths | 2023-06-05 12:07:52 -03:00 |