Author | Commit | Message | Date
oobabooga | 8930bfc5f4 | Bump PyTorch, ExLlamaV2, flash-attention (#6122) | 2024-06-13 20:38:31 -03:00
oobabooga | bd7cc4234d | Backend cleanup (#6025) | 2024-05-21 13:32:02 -03:00
dependabot[bot] | 2de586f586 | Update accelerate requirement from ==0.27.* to ==0.30.* (#5989) | 2024-05-19 20:03:18 -03:00
oobabooga | 0d90b3a25c | Bump llama-cpp-python to 0.2.75 | 2024-05-18 05:26:26 -07:00
oobabooga | 9557f49f2f | Bump llama-cpp-python to 0.2.73 | 2024-05-11 10:53:19 -07:00
oobabooga | e61055253c | Bump llama-cpp-python to 0.2.69, add --flash-attn option | 2024-05-03 04:31:22 -07:00
oobabooga | 0476f9fe70 | Bump ExLlamaV2 to 0.0.20 | 2024-05-01 16:20:50 -07:00
oobabooga | ae0f28530c | Bump llama-cpp-python to 0.2.68 | 2024-05-01 08:40:50 -07:00
oobabooga | 51fb766bea | Add back my llama-cpp-python wheels, bump to 0.2.65 (#5964) | 2024-04-30 09:11:31 -03:00
oobabooga | 9b623b8a78 | Bump llama-cpp-python to 0.2.64, use official wheels (#5921) | 2024-04-23 23:17:05 -03:00
Ashley Kleynhans | 0877741b03 | Bumped ExLlamaV2 to version 0.0.19 to resolve #5851 (#5880) | 2024-04-19 19:04:40 -03:00
oobabooga | b30bce3b2f | Bump transformers to 4.40 | 2024-04-18 16:19:31 -07:00
Philipp Emanuel Weidmann | a0c69749e6 | Revert sse-starlette version bump because it breaks API request cancellation (#5873) | 2024-04-18 15:05:00 -03:00
dependabot[bot] | 597556cb77 | Bump sse-starlette from 1.6.5 to 2.1.0 (#5831) | 2024-04-11 18:54:05 -03:00
oobabooga | 3e3a7c4250 | Bump llama-cpp-python to 0.2.61 & fix the crash | 2024-04-11 14:15:34 -07:00
oobabooga | 5f5ceaf025 | Revert "Bump llama-cpp-python to 0.2.61" (reverts commit 3ae61c0338) | 2024-04-11 13:24:57 -07:00
dependabot[bot] | bd71a504b8 | Update gradio requirement from ==4.25.* to ==4.26.* (#5832) | 2024-04-11 02:24:53 -03:00
oobabooga | 3ae61c0338 | Bump llama-cpp-python to 0.2.61 | 2024-04-10 21:39:46 -07:00
oobabooga | ed4001e324 | Bump ExLlamaV2 to 0.0.18 | 2024-04-08 18:05:16 -07:00
oobabooga | f6828de3f2 | Downgrade llama-cpp-python to 0.2.56 | 2024-04-07 07:00:12 -07:00
Jared Van Bortel | 39ff9c9dcf | requirements: add psutil (#5819) | 2024-04-06 23:02:20 -03:00
oobabooga | dfb01f9a63 | Bump llama-cpp-python to 0.2.60 | 2024-04-06 18:32:36 -07:00
oobabooga | 14f6194211 | Bump Gradio to 4.25 | 2024-04-05 09:22:44 -07:00
oobabooga | 3952560da8 | Bump llama-cpp-python to 0.2.59 | 2024-04-04 11:20:48 -07:00
oobabooga | 70c58b5fc2 | Bump ExLlamaV2 to 0.0.17 | 2024-03-30 21:08:26 -07:00
oobabooga | 3ce0d9221b | Bump transformers to 4.39 | 2024-03-28 19:40:31 -07:00
oobabooga | 2a92a842ce | Bump gradio to 4.23 (#5758) | 2024-03-26 16:32:20 -03:00
oobabooga | a102c704f5 | Add numba to requirements.txt | 2024-03-10 16:13:29 -07:00
oobabooga | b3ade5832b | Keep AQLM only for Linux (fails to install on Windows) | 2024-03-10 09:41:17 -07:00
oobabooga | 67b24b0b88 | Bump llama-cpp-python to 0.2.56 | 2024-03-10 09:07:27 -07:00
oobabooga | 0e6eb7c27a | Add AQLM support (transformers loader) (#5466) | 2024-03-08 17:30:36 -03:00
oobabooga | 2174958362 | Revert gradio to 3.50.2 (#5640) | 2024-03-06 11:52:46 -03:00
oobabooga | 03f03af535 | Revert "Update peft requirement from ==0.8.* to ==0.9.* (#5626)" (reverts commit 72a498ddd4) | 2024-03-05 02:56:37 -08:00
oobabooga | ae12d045ea | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2024-03-05 02:35:04 -08:00
dependabot[bot] | 72a498ddd4 | Update peft requirement from ==0.8.* to ==0.9.* (#5626) | 2024-03-05 07:34:32 -03:00
oobabooga | 1437f757a1 | Bump HQQ to 0.1.5 | 2024-03-05 02:33:51 -08:00
oobabooga | 63a1d4afc8 | Bump gradio to 4.19 (#5522) | 2024-03-05 07:32:28 -03:00
oobabooga | 527ba98105 | Do not install extensions requirements by default (#5621) | 2024-03-04 04:46:39 -03:00
oobabooga | 24e86bb21b | Bump llama-cpp-python to 0.2.55 | 2024-03-03 12:14:48 -08:00
oobabooga | 314e42fd98 | Fix transformers requirement | 2024-03-03 10:49:28 -08:00
oobabooga | 71b1617c1b | Remove bitsandbytes from incompatible requirements.txt files | 2024-03-03 08:24:54 -08:00
dependabot[bot] | dfdf6eb5b4 | Bump hqq from 0.1.3 to 0.1.3.post1 (#5582) | 2024-02-26 20:51:39 -03:00
oobabooga | 332957ffec | Bump llama-cpp-python to 0.2.52 | 2024-02-26 15:05:53 -08:00
Bartowski | 21acf504ce | Bump transformers to 4.38 for gemma compatibility (#5575) | 2024-02-25 20:15:13 -03:00
oobabooga | c07dc56736 | Bump llama-cpp-python to 0.2.50 | 2024-02-24 21:34:11 -08:00
oobabooga | 98580cad8e | Bump exllamav2 to 0.0.14 | 2024-02-24 18:35:42 -08:00
oobabooga | 527f2652af | Bump llama-cpp-python to 0.2.47 | 2024-02-22 19:48:49 -08:00
dependabot[bot] | 5f7dbf454a | Update optimum requirement from ==1.16.* to ==1.17.* (#5548) | 2024-02-19 19:15:21 -03:00
dependabot[bot] | ed6ff49431 | Update accelerate requirement from ==0.25.* to ==0.27.* (#5546) | 2024-02-19 19:14:04 -03:00
oobabooga | 0b2279d031 | Bump llama-cpp-python to 0.2.44 | 2024-02-19 13:42:31 -08:00
oobabooga | c375c753d6 | Bump bitsandbytes to 0.42 (Linux only) | 2024-02-16 10:47:57 -08:00
oobabooga | 080f7132c0 | Revert gradio to 3.50.2 (#5513) | 2024-02-15 20:40:23 -03:00
oobabooga | ea0e1feee7 | Bump llama-cpp-python to 0.2.43 | 2024-02-14 21:58:24 -08:00
oobabooga | 549f106879 | Bump ExLlamaV2 to v0.0.13.2 | 2024-02-14 21:57:48 -08:00
DominikKowalczyk | 33c4ce0720 | Bump gradio to 4.19 (#5419) (Co-authored-by: oobabooga) | 2024-02-14 23:28:26 -03:00
oobabooga | 25b655faeb | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2024-02-13 15:49:53 -08:00
oobabooga | f99f1fc68e | Bump llama-cpp-python to 0.2.42 | 2024-02-13 15:49:20 -08:00
dependabot[bot] | d8081e85ec | Update peft requirement from ==0.7.* to ==0.8.* (#5446) | 2024-02-13 16:27:18 -03:00
dependabot[bot] | 653b195b1e | Update numpy requirement from ==1.24.* to ==1.26.* (#5490) | 2024-02-13 16:26:35 -03:00
dependabot[bot] | 147b4cf3e0 | Bump hqq from 0.1.2.post1 to 0.1.3 (#5489) | 2024-02-13 16:25:02 -03:00
oobabooga | e9fea353c5 | Bump llama-cpp-python to 0.2.40 | 2024-02-13 11:22:34 -08:00
oobabooga | acea6a6669 | Add more exllamav2 wheels | 2024-02-07 08:24:29 -08:00
oobabooga | 35537ad3d1 | Bump exllamav2 to 0.0.13.1 (#5463) | 2024-02-07 13:17:04 -03:00
oobabooga | b8e25e8678 | Bump llama-cpp-python to 0.2.39 | 2024-02-07 06:50:47 -08:00
oobabooga | a210999255 | Bump safetensors version | 2024-02-04 18:40:25 -08:00
oobabooga | e98d1086f5 | Bump llama-cpp-python to 0.2.38 (#5420) | 2024-02-01 20:09:30 -03:00
oobabooga | 89f6036e98 | Bump llama-cpp-python, remove python 3.8/3.9, cuda 11.7 (#5397) | 2024-01-30 13:19:20 -03:00
dependabot[bot] | bfe2326a24 | Bump hqq from 0.1.2 to 0.1.2.post1 (#5349) | 2024-01-26 11:10:18 -03:00
oobabooga | 87dc421ee8 | Bump exllamav2 to 0.0.12 (#5352) | 2024-01-22 22:40:12 -03:00
oobabooga | b9d1873301 | Bump transformers to 4.37 | 2024-01-22 04:07:12 -08:00
oobabooga | b5cabb6e9d | Bump llama-cpp-python to 0.2.31 (#5345) | 2024-01-22 08:05:59 -03:00
oobabooga | 8962bb173e | Bump llama-cpp-python to 0.2.29 (#5307) | 2024-01-18 14:24:17 -03:00
oobabooga | 7916cf863b | Bump transformers (necesary for e055967974) | 2024-01-17 12:37:31 -08:00
Rimmy J | d80b191b1c | Add requirement jinja2==3.1.* to fix error as described in issue #5240 (#5249) (Co-authored-by: oobabooga, Rim) | 2024-01-13 21:47:13 -03:00
dependabot[bot] | 32cdc66cf1 | Bump hqq from 0.1.1.post1 to 0.1.2 (#5204) | 2024-01-08 22:51:44 -03:00
oobabooga | f6a204d7c9 | Bump llama-cpp-python to 0.2.26 | 2024-01-03 11:06:36 -08:00
oobabooga | 29b0f14d5a | Bump llama-cpp-python to 0.2.25 (#5077) | 2023-12-25 12:36:32 -03:00
oobabooga | d76b00c211 | Pin lm_eval package version | 2023-12-24 09:22:31 -08:00
oobabooga | f0f6d9bdf9 | Add HQQ back & update version (reverts commit 2289e9031e) | 2023-12-20 07:46:09 -08:00
oobabooga | 258c695ead | Add rich requirement | 2023-12-19 21:58:36 -08:00
oobabooga | 2289e9031e | Remove HQQ from requirements (after https://github.com/oobabooga/text-generation-webui/issues/4993) | 2023-12-19 21:33:49 -08:00
oobabooga | 0a299d5959 | Bump llama-cpp-python to 0.2.24 (#5001) | 2023-12-19 15:22:21 -03:00
dependabot[bot] | 9e48e50428 | Update optimum requirement from ==1.15.* to ==1.16.* (#4986) | 2023-12-18 21:43:29 -03:00
Water | 674be9a09a | Add HQQ quant loader (#4888) (Co-authored-by: oobabooga) | 2023-12-18 21:23:16 -03:00
oobabooga | 12690d3ffc | Better HF grammar implementation (#4953) | 2023-12-17 02:01:23 -03:00
oobabooga | d2ed0a06bf | Bump ExLlamav2 to 0.0.11 (adds Mixtral support) | 2023-12-16 16:34:15 -08:00
oobabooga | 85816898f9 | Bump llama-cpp-python to 0.2.23 (including Linux ROCm and MacOS >= 12) (#4930) | 2023-12-15 01:58:08 -03:00
oobabooga | 21a5bfc67f | Relax optimum requirement | 2023-12-12 14:05:58 -08:00
dependabot[bot] | 7a987417bb | Bump optimum from 1.14.0 to 1.15.0 (#4885) | 2023-12-12 02:32:19 -03:00
dependabot[bot] | a17750db91 | Update peft requirement from ==0.6.* to ==0.7.* (#4886) | 2023-12-12 02:31:30 -03:00
dependabot[bot] | a8a92c6c87 | Update transformers requirement from ==4.35.* to ==4.36.* (#4882) | 2023-12-12 02:30:25 -03:00
俞航 | ac9f154bcc | Bump exllamav2 from 0.0.8 to 0.0.10 & Fix code change (#4782) | 2023-12-04 21:15:05 -03:00
dependabot[bot] | 801ba87c68 | Update accelerate requirement from ==0.24.* to ==0.25.* (#4810) | 2023-12-04 20:36:01 -03:00
dependabot[bot] | 2e83844f35 | Bump safetensors from 0.4.0 to 0.4.1 (#4750) | 2023-12-03 22:50:10 -03:00
oobabooga | 0589ff5b12 | Bump llama-cpp-python to 0.2.19 & add min_p and typical_p parameters to llama.cpp loader (#4701) | 2023-11-21 20:59:39 -03:00
oobabooga | e0ca49ed9c | Bump llama-cpp-python to 0.2.18 (2nd attempt) (#4637) (update requirements*.txt; add back seed) | 2023-11-18 00:31:27 -03:00
oobabooga | 9d6f79db74 | Revert "Bump llama-cpp-python to 0.2.18 (#4611)" (reverts commit 923c8e25fb) | 2023-11-17 05:14:25 -08:00
oobabooga | 923c8e25fb | Bump llama-cpp-python to 0.2.18 (#4611) | 2023-11-16 22:55:14 -03:00
oobabooga | dea90c7b67 | Bump exllamav2 to 0.0.8 | 2023-11-13 10:34:10 -08:00
oobabooga | 2af7e382b1 | Revert "Bump llama-cpp-python to 0.2.14" (reverts commit 5c3eb22ce6; the new version has issues: https://github.com/oobabooga/text-generation-webui/issues/4540 and https://github.com/abetlen/llama-cpp-python/issues/893) | 2023-11-09 10:02:13 -08:00
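Many entries above pin dependencies with pip wildcard specifiers such as `accelerate==0.30.*` or `peft==0.9.*`, which accept any release sharing the pinned prefix. As a rough illustration only (not pip's actual resolver, which also normalizes versions per PEP 440), a minimal sketch of what such a specifier matches:

```python
def matches_wildcard(version: str, spec: str) -> bool:
    """Simplified check for a pip '==X.Y.*' wildcard specifier:
    the version must share the pinned release prefix.
    Illustrative only; real tools normalize versions per PEP 440."""
    if not (spec.startswith("==") and spec.endswith(".*")):
        raise ValueError("expected a specifier like '==0.30.*'")
    prefix = spec[2:-2].split(".")   # "==0.30.*" -> ["0", "30"]
    parts = version.split(".")
    return parts[: len(prefix)] == prefix


# e.g. the dependabot bump "==0.27.* to ==0.30.*" means 0.30.1 is now
# accepted while 0.27.0 no longer is:
print(matches_wildcard("0.30.1", "==0.30.*"))  # True
print(matches_wildcard("0.27.0", "==0.30.*"))  # False
```

This is why a bump like `==0.27.*` to `==0.30.*` still allows patch releases to flow in automatically while excluding other minor versions.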