Commit Graph

3291 Commits

Author | SHA1 | Message | Date
Gennadij | e12a1852d9 | Add Vicuna-v1.5 detection (#3524) | 2023-08-10 13:42:24 -03:00
jllllll | 28e3ce4317 | Simplify GPTQ-for-LLaMa installation (#122) | 2023-08-10 13:19:47 -03:00
oobabooga | e3d2ddd170 | Streamline GPTQ-for-LLaMa support (#3526 from jllllll/gptqllama) | 2023-08-10 12:54:59 -03:00
oobabooga | c7f52bbdc1 | Revert "Remove GPTQ-for-LLaMa monkey patch support" | 2023-08-10 08:39:41 -07:00
    This reverts commit e3d3565b2a.
oobabooga | 16e2b117b4 | Minor doc change | 2023-08-10 08:38:10 -07:00
jllllll | d6765bebc4 | Update installation documentation | 2023-08-10 00:53:48 -05:00
jllllll | d7ee4c2386 | Remove unused import | 2023-08-10 00:10:14 -05:00
jllllll | e3d3565b2a | Remove GPTQ-for-LLaMa monkey patch support | 2023-08-09 23:59:04 -05:00
    AutoGPTQ will be the preferred GPTQ LoRA loader in the future.
jllllll | bee73cedbd | Streamline GPTQ-for-LLaMa support | 2023-08-09 23:42:34 -05:00
oobabooga | a3295dd666 | Detect n_gqa and prompt template for wizardlm-70b | 2023-08-09 10:51:16 -07:00
oobabooga | a4e48cbdb6 | Bump AutoGPTQ | 2023-08-09 08:31:17 -07:00
oobabooga | 7c1300fab5 | Pin aiofiles version to fix statvfs issue | 2023-08-09 08:07:55 -07:00
oobabooga | 6c6a52aaad | Change the filenames for caches and histories | 2023-08-09 07:47:19 -07:00
oobabooga | 2255349f19 | Update README | 2023-08-09 05:46:25 -07:00
GiganticPrime | 5bfcfcfc5a | Added the logic for starchat model series (#3185) | 2023-08-09 09:26:12 -03:00
oobabooga | fa4a948b38 | Allow users to write one flag per line in CMD_FLAGS.txt | 2023-08-09 01:58:23 -03:00
oobabooga | d8fb506aff | Add RoPE scaling support for transformers (including dynamic NTK) | 2023-08-08 21:25:48 -07:00
    https://github.com/huggingface/transformers/pull/24653
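Commit fa4a948b38 above changes CMD_FLAGS.txt so users can write one flag per line instead of a single long line. A minimal sketch of how such a file might be parsed (this is an illustration, not the project's actual implementation; the function name and comment handling are assumptions):

```python
from pathlib import Path

def read_cmd_flags(path="CMD_FLAGS.txt"):
    """Hypothetical sketch: collect one flag per line into a single
    flags string, skipping blank lines and '#' comments."""
    p = Path(path)
    if not p.exists():
        return ""
    flags = []
    for line in p.read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            flags.append(line)
    return " ".join(flags)
```

With a file containing `--listen` and `--api` on separate lines, this would yield the single string `--listen --api` to pass along to the launcher.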
Hans Raaf | f4caaf337a | Fix superbooga when using regenerate (#3362) | 2023-08-08 23:26:28 -03:00
Friedemann Lipphardt | 901b028d55 | Add option for named cloudflare tunnels (#3364) | 2023-08-08 22:20:27 -03:00
oobabooga | 4ba30f6765 | Add OpenChat template | 2023-08-08 14:10:04 -07:00
oobabooga | bf08b16b32 | Fix disappearing profile picture bug | 2023-08-08 14:09:01 -07:00
Gennadij | 0e78f3b4d4 | Fixed a typo in "rms_norm_eps", incorrectly set as n_gqa (#3494) | 2023-08-08 00:31:11 -03:00
oobabooga | 37fb719452 | Increase the Context/Greeting boxes sizes | 2023-08-08 00:09:00 -03:00
oobabooga | 6d354bb50b | Allow the webui to do multiple tasks simultaneously | 2023-08-07 23:57:25 -03:00
oobabooga | 584dd33424 | Fix missing example_dialogue when uploading characters | 2023-08-07 23:44:59 -03:00
oobabooga | bbe4a29a25 | Add back dark theme code | 2023-08-07 23:03:09 -03:00
oobabooga | 2d0634cd07 | Bump transformers commit for positive prompts | 2023-08-07 08:57:19 -07:00
Sam | 3b27404865 | Make dockerfile respect specified cuda version (#3474) | 2023-08-07 10:19:16 -03:00
oobabooga | 412f6ff9d3 | Change alpha_value maximum and step | 2023-08-07 06:08:51 -07:00
oobabooga | a373c96d59 | Fix a bug in modules/shared.py | 2023-08-06 20:36:35 -07:00
jllllll | 2cf64474f2 | Use chat_instruct_command in API (#3482) | 2023-08-06 23:46:25 -03:00
oobabooga | 3d48933f27 | Remove ancient deprecation warnings | 2023-08-06 18:58:59 -07:00
oobabooga | c237ce607e | Move characters/instruction-following to instruction-templates | 2023-08-06 17:50:32 -07:00
oobabooga | 65aa11890f | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00
oobabooga | d4b851bdc8 | Credit turboderp | 2023-08-06 13:43:15 -07:00
oobabooga | 0af10ab49b | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00
missionfloyd | 5134878344 | Fix chat message order (#3461) | 2023-08-05 13:53:54 -03:00
jllllll | 44f31731af | Create logs dir if missing when saving history (#3462) | 2023-08-05 13:47:16 -03:00
jllllll | 5ee95d126c | Bump exllama wheels to 0.0.10 (#3467) | 2023-08-05 13:46:14 -03:00
Forkoz | 9dcb37e8d4 | Fix: Mirostat fails on models split across multiple GPUs | 2023-08-05 13:45:47 -03:00
jllllll | 9e17325207 | Add CMD_FLAGS.txt functionality to WSL installer (#119) | 2023-08-05 10:26:24 -03:00
SodaPrettyCold | 23055b21ee | [Bug fix] Remove html tags from the Prompt sent to Stable Diffusion (#3151) | 2023-08-04 20:20:28 -03:00
jllllll | 6e30f76ba5 | Bump bitsandbytes to 0.41.1 (#3457) | 2023-08-04 19:28:59 -03:00
oobabooga | 8df3cdfd51 | Add SSL certificate support (#3453) | 2023-08-04 13:57:31 -03:00
oobabooga | ed57a79c6e | Add back silero preview by @missionfloyd (#3446) | 2023-08-04 02:29:14 -03:00
missionfloyd | 2336b75d92 | Remove unnecessary chat.js (#3445) | 2023-08-04 01:58:37 -03:00
oobabooga | 4b3384e353 | Handle unfinished lists during markdown streaming | 2023-08-03 17:15:18 -07:00
Pete | f4005164f4 | Fix llama.cpp truncation (#3400) | 2023-08-03 20:01:15 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga | 4e6dc6d99d | Add Contributing guidelines | 2023-08-03 14:40:28 -07:00
matatonic | 8f98268252 | extensions/openai: include content-length for json replies (#3416) | 2023-08-03 16:10:49 -03:00
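Commit 4b3384e353 above handles unfinished lists during markdown streaming. One way such a fix can work, sketched here as an illustration (the function name and exact heuristic are assumptions, not the project's actual code): while tokens stream in, the last line of the partial text may be a bare list marker that a markdown renderer would display as an empty bullet, so it is withheld until more text arrives.

```python
import re

def strip_unfinished_list_item(partial_md: str) -> str:
    """Hypothetical sketch: drop a trailing line that is only a list
    marker ('-', '*', '+', or '1.') so streamed partial markdown does
    not flash an empty bullet mid-generation."""
    lines = partial_md.split("\n")
    if lines and re.fullmatch(r"\s*(?:[-*+]|\d+\.)\s*", lines[-1]):
        lines = lines[:-1]
    return "\n".join(lines)
```

For example, the partial text `"Items:\n- one\n- "` would be rendered as `"Items:\n- one"` until the second item's content streams in.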