oobabooga | a50477ec85 | Apply the change to all requirements (oops) | 2024-09-06 18:47:25 -07:00
oobabooga | 2cb8d4c96e | Bump llama-cpp-python to 0.2.90 | 2024-09-03 05:53:18 -07:00
oobabooga | d1168afa76 | Bump ExLlamaV2 to 0.2.0 | 2024-09-02 21:15:51 -07:00
oobabooga | 1f288b4072 | Bump ExLlamaV2 to 0.1.9 | 2024-08-22 12:40:15 -07:00
dependabot[bot] | 64e16e9a46 | Update accelerate requirement from ==0.32.* to ==0.33.* (#6291) | 2024-08-19 23:34:10 -03:00
dependabot[bot] | 68f928b5e0 | Update peft requirement from ==0.8.* to ==0.12.* (#6292) | 2024-08-19 23:33:56 -03:00
oobabooga | 4d8c1801c2 | Bump llama-cpp-python to 0.2.89 | 2024-08-19 17:45:01 -07:00
oobabooga | bf8187124d | Bump llama-cpp-python to 0.2.88 | 2024-08-13 12:40:18 -07:00
oobabooga | 089d5a9415 | Bump llama-cpp-python to 0.2.87 | 2024-08-07 20:36:28 -07:00
oobabooga | 81773f7f36 | Bump transformers to 4.44 | 2024-08-06 20:07:05 -07:00
oobabooga | 608545d282 | Bump llama-cpp-python to 0.2.85 | 2024-07-31 18:44:46 -07:00
oobabooga | 92ab3a9a6a | Bump llama-cpp-python to 0.2.84 | 2024-07-28 15:13:06 -07:00
oobabooga | 8a5f110c14 | Bump ExLlamaV2 to 0.1.8 | 2024-07-24 09:22:48 -07:00
oobabooga | f66ab63d64 | Bump transformers to 4.43 | 2024-07-23 14:06:34 -07:00
oobabooga | 3ee682208c | Revert "Bump hqq from 0.1.7.post3 to 0.1.8 (#6238)". This reverts commit 1c3671699c. | 2024-07-22 19:53:56 -07:00
oobabooga | aa809e420e | Bump llama-cpp-python to 0.2.83, add back tensorcore wheels. Also add back the progress bar patch. | 2024-07-22 18:05:11 -07:00
oobabooga | 11bbf71aa5 | Bump back llama-cpp-python (#6257) | 2024-07-22 16:19:41 -03:00
oobabooga | 0f53a736c1 | Revert the llama-cpp-python update | 2024-07-22 12:02:25 -07:00
oobabooga | 7d2449f8b0 | Bump llama-cpp-python to 0.2.82.3 (unofficial build) | 2024-07-22 11:49:20 -07:00
dependabot[bot] | 1c3671699c | Bump hqq from 0.1.7.post3 to 0.1.8 (#6238) | 2024-07-20 18:20:26 -03:00
dependabot[bot] | 063d2047dd | Update accelerate requirement from ==0.31.* to ==0.32.* (#6217) | 2024-07-11 19:56:42 -03:00
oobabooga | 01e4721da7 | Bump ExLlamaV2 to 0.1.7 | 2024-07-11 12:33:46 -07:00
oobabooga | fa075e41f4 | Bump llama-cpp-python to 0.2.82 | 2024-07-10 06:03:24 -07:00
oobabooga | 7e22eaa36c | Bump llama-cpp-python to 0.2.81 | 2024-07-02 20:29:35 -07:00
dependabot[bot] | a5df8f4e3c | Bump jinja2 from 3.1.2 to 3.1.4 (#6172) | 2024-06-27 21:12:39 -03:00
dependabot[bot] | c6cec0588c | Update accelerate requirement from ==0.30.* to ==0.31.* (#6156) | 2024-06-27 21:12:02 -03:00
oobabooga | 66090758df | Bump transformers to 4.42 (for gemma support) | 2024-06-27 11:26:02 -07:00
oobabooga | 602b455507 | Bump llama-cpp-python to 0.2.79 | 2024-06-24 20:26:38 -07:00
oobabooga | 7db8b3b532 | Bump ExLlamaV2 to 0.1.6 | 2024-06-24 05:38:11 -07:00
oobabooga | 125bb7b03b | Revert "Bump llama-cpp-python to 0.2.78". This reverts commit b6eaf7923e. | 2024-06-23 19:54:28 -07:00
oobabooga | b6eaf7923e | Bump llama-cpp-python to 0.2.78 | 2024-06-14 21:22:09 -07:00
oobabooga | 9420973b62 | Downgrade PyTorch to 2.2.2 (#6124) | 2024-06-14 16:42:03 -03:00
dependabot[bot] | fdd8fab9cf | Bump hqq from 0.1.7.post2 to 0.1.7.post3 (#6090) | 2024-06-14 13:46:35 -03:00
oobabooga | 8930bfc5f4 | Bump PyTorch, ExLlamaV2, flash-attention (#6122) | 2024-06-13 20:38:31 -03:00
oobabooga | bd7cc4234d | Backend cleanup (#6025) | 2024-05-21 13:32:02 -03:00
dependabot[bot] | 2de586f586 | Update accelerate requirement from ==0.27.* to ==0.30.* (#5989) | 2024-05-19 20:03:18 -03:00
oobabooga | 0d90b3a25c | Bump llama-cpp-python to 0.2.75 | 2024-05-18 05:26:26 -07:00
oobabooga | 9557f49f2f | Bump llama-cpp-python to 0.2.73 | 2024-05-11 10:53:19 -07:00
oobabooga | e61055253c | Bump llama-cpp-python to 0.2.69, add --flash-attn option | 2024-05-03 04:31:22 -07:00
oobabooga | 0476f9fe70 | Bump ExLlamaV2 to 0.0.20 | 2024-05-01 16:20:50 -07:00
oobabooga | ae0f28530c | Bump llama-cpp-python to 0.2.68 | 2024-05-01 08:40:50 -07:00
oobabooga | 51fb766bea | Add back my llama-cpp-python wheels, bump to 0.2.65 (#5964) | 2024-04-30 09:11:31 -03:00
oobabooga | 9b623b8a78 | Bump llama-cpp-python to 0.2.64, use official wheels (#5921) | 2024-04-23 23:17:05 -03:00
Ashley Kleynhans | 0877741b03 | Bumped ExLlamaV2 to version 0.0.19 to resolve #5851 (#5880) | 2024-04-19 19:04:40 -03:00
oobabooga | b30bce3b2f | Bump transformers to 4.40 | 2024-04-18 16:19:31 -07:00
Philipp Emanuel Weidmann | a0c69749e6 | Revert sse-starlette version bump because it breaks API request cancellation (#5873) | 2024-04-18 15:05:00 -03:00
dependabot[bot] | 597556cb77 | Bump sse-starlette from 1.6.5 to 2.1.0 (#5831) | 2024-04-11 18:54:05 -03:00
oobabooga | 3e3a7c4250 | Bump llama-cpp-python to 0.2.61 & fix the crash | 2024-04-11 14:15:34 -07:00
oobabooga | 5f5ceaf025 | Revert "Bump llama-cpp-python to 0.2.61". This reverts commit 3ae61c0338. | 2024-04-11 13:24:57 -07:00
dependabot[bot] | bd71a504b8 | Update gradio requirement from ==4.25.* to ==4.26.* (#5832) | 2024-04-11 02:24:53 -03:00