Commit Graph

2344 Commits

Author SHA1 Message Date
oobabooga
919a3cf9d0 Fix the gallery 2023-08-13 05:43:09 -07:00
oobabooga
a1a9ec895d Unify the 3 interface modes (#3554) 2023-08-13 01:12:15 -03:00
cal066
bf70c19603 ctransformers: move thread and seed parameters (#3543) 2023-08-13 00:04:03 -03:00
jllllll
73421b1fed Bump ctransformers wheel version (#3558) 2023-08-12 23:02:47 -03:00
oobabooga
0e05818266 Style changes 2023-08-11 16:35:57 -07:00
oobabooga
4c450e6b70 Update README.md 2023-08-11 15:50:16 -03:00
oobabooga
2f918ccf7c Remove unused parameter 2023-08-11 11:15:22 -07:00
oobabooga
28c8df337b Add repetition_penalty_range to ctransformers 2023-08-11 11:04:19 -07:00
cal066
7a4fcee069 Add ctransformers support (#3313)
Co-authored-by: cal066 <cal066@users.noreply.github.com>
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
Co-authored-by: randoentity <137087500+randoentity@users.noreply.github.com>
2023-08-11 14:41:33 -03:00
oobabooga
8dbaa20ca8 Don't replace last reply with an empty message 2023-08-10 13:14:48 -07:00
oobabooga
0789554f65 Allow --lora to use an absolute path 2023-08-10 10:03:12 -07:00
oobabooga
3929971b66 Don't show oobabooga_llama-tokenizer in the model dropdown 2023-08-10 10:02:48 -07:00
Gennadij
e12a1852d9 Add Vicuna-v1.5 detection (#3524) 2023-08-10 13:42:24 -03:00
oobabooga
e3d2ddd170 Streamline GPTQ-for-LLaMa support (#3526 from jllllll/gptqllama) 2023-08-10 12:54:59 -03:00
oobabooga
c7f52bbdc1 Revert "Remove GPTQ-for-LLaMa monkey patch support"
This reverts commit e3d3565b2a.
2023-08-10 08:39:41 -07:00
oobabooga
16e2b117b4 Minor doc change 2023-08-10 08:38:10 -07:00
jllllll
d6765bebc4 Update installation documentation 2023-08-10 00:53:48 -05:00
jllllll
d7ee4c2386 Remove unused import 2023-08-10 00:10:14 -05:00
jllllll
e3d3565b2a Remove GPTQ-for-LLaMa monkey patch support
AutoGPTQ will be the preferred GPTQ LoRA loader in the future.
2023-08-09 23:59:04 -05:00
jllllll
bee73cedbd Streamline GPTQ-for-LLaMa support 2023-08-09 23:42:34 -05:00
oobabooga
a3295dd666 Detect n_gqa and prompt template for wizardlm-70b 2023-08-09 10:51:16 -07:00
oobabooga
a4e48cbdb6 Bump AutoGPTQ 2023-08-09 08:31:17 -07:00
oobabooga
7c1300fab5 Pin aiofiles version to fix statvfs issue 2023-08-09 08:07:55 -07:00
oobabooga
6c6a52aaad Change the filenames for caches and histories 2023-08-09 07:47:19 -07:00
oobabooga
2255349f19 Update README 2023-08-09 05:46:25 -07:00
GiganticPrime
5bfcfcfc5a Added the logic for starchat model series (#3185) 2023-08-09 09:26:12 -03:00
oobabooga
d8fb506aff Add RoPE scaling support for transformers (including dynamic NTK)
https://github.com/huggingface/transformers/pull/24653
2023-08-08 21:25:48 -07:00
Hans Raaf
f4caaf337a Fix superbooga when using regenerate (#3362) 2023-08-08 23:26:28 -03:00
Friedemann Lipphardt
901b028d55 Add option for named cloudflare tunnels (#3364) 2023-08-08 22:20:27 -03:00
oobabooga
4ba30f6765 Add OpenChat template 2023-08-08 14:10:04 -07:00
oobabooga
bf08b16b32 Fix disappearing profile picture bug 2023-08-08 14:09:01 -07:00
Gennadij
0e78f3b4d4 Fixed a typo in "rms_norm_eps", incorrectly set as n_gqa (#3494) 2023-08-08 00:31:11 -03:00
oobabooga
37fb719452 Increase the Context/Greeting boxes sizes 2023-08-08 00:09:00 -03:00
oobabooga
6d354bb50b Allow the webui to do multiple tasks simultaneously 2023-08-07 23:57:25 -03:00
oobabooga
584dd33424 Fix missing example_dialogue when uploading characters 2023-08-07 23:44:59 -03:00
oobabooga
bbe4a29a25 Add back dark theme code 2023-08-07 23:03:09 -03:00
oobabooga
2d0634cd07 Bump transformers commit for positive prompts 2023-08-07 08:57:19 -07:00
Sam
3b27404865 Make dockerfile respect specified cuda version (#3474) 2023-08-07 10:19:16 -03:00
oobabooga
412f6ff9d3 Change alpha_value maximum and step 2023-08-07 06:08:51 -07:00
oobabooga
a373c96d59 Fix a bug in modules/shared.py 2023-08-06 20:36:35 -07:00
jllllll
2cf64474f2 Use chat_instruct_command in API (#3482) 2023-08-06 23:46:25 -03:00
oobabooga
3d48933f27 Remove ancient deprecation warnings 2023-08-06 18:58:59 -07:00
oobabooga
c237ce607e Move characters/instruction-following to instruction-templates 2023-08-06 17:50:32 -07:00
oobabooga
65aa11890f Refactor everything (#3481) 2023-08-06 21:49:27 -03:00
oobabooga
d4b851bdc8 Credit turboderp 2023-08-06 13:43:15 -07:00
oobabooga
0af10ab49b Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) 2023-08-06 17:22:48 -03:00
missionfloyd
5134878344 Fix chat message order (#3461) 2023-08-05 13:53:54 -03:00
jllllll
44f31731af Create logs dir if missing when saving history (#3462) 2023-08-05 13:47:16 -03:00
jllllll
5ee95d126c Bump exllama wheels to 0.0.10 (#3467) 2023-08-05 13:46:14 -03:00
Forkoz
9dcb37e8d4 Fix: Mirostat fails on models split across multiple GPUs 2023-08-05 13:45:47 -03:00