Commit Graph

119 Commits

Author | SHA1 | Message | Date
jllllll | 1610d5ffb2 | Bump exllama module to 0.0.5 (#2993) | 2023-07-04 00:15:55 -03:00
oobabooga | c6cae106e7 | Bump llama-cpp-python | 2023-06-28 18:14:45 -03:00
jllllll | 7b048dcf67 | Bump exllama module version to 0.0.4 (#2915) | 2023-06-28 18:09:58 -03:00
jllllll | bef67af23c | Use pre-compiled python module for ExLlama (#2770) | 2023-06-24 20:24:17 -03:00
jllllll | a06acd6d09 | Update bitsandbytes to 0.39.1 (#2799) | 2023-06-21 15:04:45 -03:00
oobabooga | c623e142ac | Bump llama-cpp-python | 2023-06-20 00:49:38 -03:00
oobabooga | 490a1795f0 | Bump peft commit | 2023-06-18 16:42:11 -03:00
dependabot[bot] | 909d8c6ae3 | Bump transformers from 4.30.0 to 4.30.2 (#2695) | 2023-06-14 19:56:28 -03:00
oobabooga | ea0eabd266 | Bump llama-cpp-python version | 2023-06-10 21:59:29 -03:00
oobabooga | 0f8140e99d | Bump transformers/accelerate/peft/autogptq | 2023-06-09 00:25:13 -03:00
oobabooga | 5d515eeb8c | Bump llama-cpp-python wheel | 2023-06-06 13:01:15 -03:00
dependabot[bot] | 97f3fa843f | Bump llama-cpp-python from 0.1.56 to 0.1.57 (#2537) | 2023-06-05 23:45:58 -03:00
oobabooga | 4e9937aa99 | Bump gradio | 2023-06-05 17:29:21 -03:00
jllllll | 5216117a63 | Fix MacOS incompatibility in requirements.txt (#2485) | 2023-06-02 01:46:16 -03:00
oobabooga | b4ad060c1f | Use cuda 11.7 instead of 11.8 | 2023-06-02 01:04:44 -03:00
oobabooga | d0aca83b53 | Add AutoGPTQ wheels to requirements.txt | 2023-06-02 00:47:11 -03:00
oobabooga | 2cdf525d3b | Bump llama-cpp-python version | 2023-05-31 23:29:02 -03:00
Honkware | 204731952a | Falcon support (trust-remote-code and autogptq checkboxes) (#2367); Co-authored-by: oobabooga | 2023-05-29 10:20:18 -03:00
jllllll | 78dbec4c4e | Add 'scipy' to requirements.txt #2335 (#2343); unlisted dependency of bitsandbytes | 2023-05-25 23:26:25 -03:00
oobabooga | 548f05e106 | Add windows bitsandbytes wheel by jllllll | 2023-05-25 10:48:22 -03:00
oobabooga | 361451ba60 | Add --load-in-4bit parameter (#2320) | 2023-05-25 01:14:13 -03:00
eiery | 9967e08b1f | update llama-cpp-python to v0.1.53 for ggml v3, fixes #2245 (#2264) | 2023-05-24 10:25:28 -03:00
oobabooga | 1490c0af68 | Remove RWKV from requirements.txt | 2023-05-23 20:49:20 -03:00
dependabot[bot] | baf75356d4 | Bump transformers from 4.29.1 to 4.29.2 (#2268) | 2023-05-22 02:50:18 -03:00
jllllll | 2aa01e2303 | Fix broken version of peft (#2229) | 2023-05-20 17:54:51 -03:00
oobabooga | 511470a89b | Bump llama-cpp-python version | 2023-05-19 12:13:25 -03:00
oobabooga | 259020a0be | Bump gradio to 3.31.0; this fixes Google Colab lagging | 2023-05-16 22:21:15 -03:00
dependabot[bot] | ae54d83455 | Bump transformers from 4.28.1 to 4.29.1 (#2089) | 2023-05-15 19:25:24 -03:00
feeelX | eee986348c | Update llama-cpp-python from 0.1.45 to 0.1.50 (#2058) | 2023-05-14 22:41:14 -03:00
dependabot[bot] | a5bb278631 | Bump accelerate from 0.18.0 to 0.19.0 (#1925) | 2023-05-09 02:17:27 -03:00
oobabooga | b040b4110d | Bump llama-cpp-python version | 2023-05-08 00:21:17 -03:00
oobabooga | 81be7c2dd4 | Specify gradio_client version | 2023-05-06 21:50:04 -03:00
oobabooga | 60be76f0fc | Revert gradio bump (gallery is broken) | 2023-05-03 11:53:30 -03:00
oobabooga | d016c38640 | Bump gradio version | 2023-05-02 19:19:33 -03:00
dependabot[bot] | 280c2f285f | Bump safetensors from 0.3.0 to 0.3.1 (#1720) | 2023-05-02 00:42:39 -03:00
oobabooga | 56b13d5d48 | Bump llama-cpp-python version | 2023-05-02 00:41:54 -03:00
oobabooga | 2f6e2ddeac | Bump llama-cpp-python version | 2023-04-24 03:42:03 -03:00
oobabooga | c4f4f41389 | Add an "Evaluate" tab to calculate the perplexities of models (#1322) | 2023-04-21 00:20:33 -03:00
oobabooga | 39099663a0 | Add 4-bit LoRA support (#1200) | 2023-04-16 23:26:52 -03:00
dependabot[bot] | 4cd2a9d824 | Bump transformers from 4.28.0 to 4.28.1 (#1288) | 2023-04-16 21:12:57 -03:00
oobabooga | d2ea925fa5 | Bump llama-cpp-python to use LlamaCache | 2023-04-16 00:53:40 -03:00
catalpaaa | 94700cc7a5 | Bump gradio to 3.25 (#1089) | 2023-04-14 23:45:25 -03:00
Alex "mcmonkey" Goodwin | 64e3b44e0f | initial multi-lora support (#1103); Co-authored-by: oobabooga | 2023-04-14 14:52:06 -03:00
dependabot[bot] | 852a5aa13d | Bump bitsandbytes from 0.37.2 to 0.38.1 (#1158) | 2023-04-13 21:23:14 -03:00
dependabot[bot] | 84576a80d2 | Bump llama-cpp-python from 0.1.30 to 0.1.33 (#1157) | 2023-04-13 21:17:59 -03:00
oobabooga | 2908a51587 | Settle for transformers 4.28.0 | 2023-04-13 21:07:00 -03:00
oobabooga | 32d078487e | Add llama-cpp-python to requirements.txt | 2023-04-10 10:45:51 -03:00
oobabooga | d272ac46dd | Add Pillow as a requirement | 2023-04-08 18:48:46 -03:00
oobabooga | 58ed87e5d9 | Update requirements.txt | 2023-04-06 18:42:54 -03:00
dependabot[bot] | 21be80242e | Bump rwkv from 0.7.2 to 0.7.3 (#842) | 2023-04-06 17:52:27 -03:00
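Most of the commits above edit requirements.txt: exact version pins that Dependabot bumps, and platform-conditional entries of the kind the macOS fix and the Windows bitsandbytes wheel imply. As a minimal sketch of what such entries look like (version numbers are taken from the commit messages above; the environment marker and the wheel URL are illustrative placeholders, not the file's actual contents):

```
# Exact pins of the kind Dependabot bumps
transformers==4.30.2
accelerate==0.19.0
llama-cpp-python==0.1.57

# Environment marker limiting a package to certain platforms
# (marker condition shown here is an assumption, not from the repo)
bitsandbytes==0.39.1; platform_system != "Darwin"

# Direct link to a pre-compiled wheel, as used for the ExLlama and
# AutoGPTQ modules (placeholder URL)
# https://example.com/path/to/package-0.0.5-py3-none-any.whl
```

pip evaluates the `; platform_system != "..."` marker at install time, so one requirements.txt can serve Linux, Windows, and macOS users without separate files.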