Commit history (author | commit | message | date):

oobabooga | 5f5ceaf025 | Revert "Bump llama-cpp-python to 0.2.61" (reverts commit 3ae61c0338) | 2024-04-11 13:24:57 -07:00
dependabot[bot] | bd71a504b8 | Update gradio requirement from ==4.25.* to ==4.26.* (#5832) | 2024-04-11 02:24:53 -03:00
oobabooga | 3ae61c0338 | Bump llama-cpp-python to 0.2.61 | 2024-04-10 21:39:46 -07:00
oobabooga | ed4001e324 | Bump ExLlamaV2 to 0.0.18 | 2024-04-08 18:05:16 -07:00
oobabooga | f6828de3f2 | Downgrade llama-cpp-python to 0.2.56 | 2024-04-07 07:00:12 -07:00
Jared Van Bortel | 39ff9c9dcf | requirements: add psutil (#5819) | 2024-04-06 23:02:20 -03:00
oobabooga | dfb01f9a63 | Bump llama-cpp-python to 0.2.60 | 2024-04-06 18:32:36 -07:00
dependabot[bot] | a4c67e1974 | Bump aqlm[cpu,gpu] from 1.1.2 to 1.1.3 (#5790) | 2024-04-05 13:26:49 -03:00
oobabooga | 14f6194211 | Bump Gradio to 4.25 | 2024-04-05 09:22:44 -07:00
oobabooga | d423021a48 | Remove CTransformers support (#5807) | 2024-04-04 20:23:58 -03:00
oobabooga | 3952560da8 | Bump llama-cpp-python to 0.2.59 | 2024-04-04 11:20:48 -07:00
oobabooga | 70c58b5fc2 | Bump ExLlamaV2 to 0.0.17 | 2024-03-30 21:08:26 -07:00
oobabooga | 3ce0d9221b | Bump transformers to 4.39 | 2024-03-28 19:40:31 -07:00
dependabot[bot] | 3609ea69e4 | Bump aqlm[cpu,gpu] from 1.1.0 to 1.1.2 (#5728) | 2024-03-26 16:36:16 -03:00
oobabooga | 2a92a842ce | Bump gradio to 4.23 (#5758) | 2024-03-26 16:32:20 -03:00
oobabooga | a102c704f5 | Add numba to requirements.txt | 2024-03-10 16:13:29 -07:00
oobabooga | b3ade5832b | Keep AQLM only for Linux (fails to install on Windows) | 2024-03-10 09:41:17 -07:00
oobabooga | 67b24b0b88 | Bump llama-cpp-python to 0.2.56 | 2024-03-10 09:07:27 -07:00
oobabooga | 763f9beb7e | Bump bitsandbytes to 0.43, add official Windows wheel | 2024-03-10 08:30:53 -07:00
oobabooga | 9271e80914 | Add back AutoAWQ for Windows (https://github.com/casper-hansen/AutoAWQ/issues/377#issuecomment-1986440695) | 2024-03-08 14:54:56 -08:00
oobabooga | d0663bae31 | Bump AutoAWQ to 0.2.3 (Linux only) (#5658) | 2024-03-08 17:36:28 -03:00
oobabooga | 0e6eb7c27a | Add AQLM support (transformers loader) (#5466) | 2024-03-08 17:30:36 -03:00
oobabooga | bde7f00cae | Change the exllamav2 version number | 2024-03-06 21:08:29 -08:00
oobabooga | 2ec1d96c91 | Add cache_4bit option for ExLlamaV2 (#5645) | 2024-03-06 23:02:25 -03:00
oobabooga | 2174958362 | Revert gradio to 3.50.2 (#5640) | 2024-03-06 11:52:46 -03:00
oobabooga | 03f03af535 | Revert "Update peft requirement from ==0.8.* to ==0.9.* (#5626)" (reverts commit 72a498ddd4) | 2024-03-05 02:56:37 -08:00
oobabooga | ae12d045ea | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2024-03-05 02:35:04 -08:00
dependabot[bot] | 72a498ddd4 | Update peft requirement from ==0.8.* to ==0.9.* (#5626) | 2024-03-05 07:34:32 -03:00
oobabooga | 1437f757a1 | Bump HQQ to 0.1.5 | 2024-03-05 02:33:51 -08:00
oobabooga | 63a1d4afc8 | Bump gradio to 4.19 (#5522) | 2024-03-05 07:32:28 -03:00
oobabooga | 527ba98105 | Do not install extensions requirements by default (#5621) | 2024-03-04 04:46:39 -03:00
oobabooga | 8bd4960d05 | Update PyTorch to 2.2 (also update flash-attn to 2.5.6) (#5618) | 2024-03-03 19:40:32 -03:00
oobabooga | 70047a5c57 | Bump bitsandbytes to 0.42.0 on Windows | 2024-03-03 13:19:27 -08:00
oobabooga | 24e86bb21b | Bump llama-cpp-python to 0.2.55 | 2024-03-03 12:14:48 -08:00
oobabooga | 314e42fd98 | Fix transformers requirement | 2024-03-03 10:49:28 -08:00
dependabot[bot] | dfdf6eb5b4 | Bump hqq from 0.1.3 to 0.1.3.post1 (#5582) | 2024-02-26 20:51:39 -03:00
oobabooga | 332957ffec | Bump llama-cpp-python to 0.2.52 | 2024-02-26 15:05:53 -08:00
Bartowski | 21acf504ce | Bump transformers to 4.38 for gemma compatibility (#5575) | 2024-02-25 20:15:13 -03:00
oobabooga | c07dc56736 | Bump llama-cpp-python to 0.2.50 | 2024-02-24 21:34:11 -08:00
oobabooga | 98580cad8e | Bump exllamav2 to 0.0.14 | 2024-02-24 18:35:42 -08:00
oobabooga | 527f2652af | Bump llama-cpp-python to 0.2.47 | 2024-02-22 19:48:49 -08:00
oobabooga | 3f42e3292a | Revert "Bump autoawq from 0.1.8 to 0.2.2 (#5547)" (reverts commit d04fef6a07) | 2024-02-22 19:48:04 -08:00
dependabot[bot] | 5f7dbf454a | Update optimum requirement from ==1.16.* to ==1.17.* (#5548) | 2024-02-19 19:15:21 -03:00
dependabot[bot] | d04fef6a07 | Bump autoawq from 0.1.8 to 0.2.2 (#5547) | 2024-02-19 19:14:55 -03:00
dependabot[bot] | ed6ff49431 | Update accelerate requirement from ==0.25.* to ==0.27.* (#5546) | 2024-02-19 19:14:04 -03:00
oobabooga | 0b2279d031 | Bump llama-cpp-python to 0.2.44 | 2024-02-19 13:42:31 -08:00
oobabooga | c375c753d6 | Bump bitsandbytes to 0.42 (Linux only) | 2024-02-16 10:47:57 -08:00
oobabooga | 080f7132c0 | Revert gradio to 3.50.2 (#5513) | 2024-02-15 20:40:23 -03:00
oobabooga | ea0e1feee7 | Bump llama-cpp-python to 0.2.43 | 2024-02-14 21:58:24 -08:00
oobabooga | 549f106879 | Bump ExLlamaV2 to v0.0.13.2 | 2024-02-14 21:57:48 -08:00
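For context, the `==X.*` specifiers that the dependabot commits above update use pip's wildcard version-matching syntax: `gradio==4.26.*` accepts any 4.26.x patch release while blocking 4.27 and later. A hypothetical requirements.txt fragment in that style, with versions taken from the most recent commits above:

```text
# Wildcard pins: any patch release within the stated minor version.
gradio==4.26.*
peft==0.9.*
accelerate==0.27.*
optimum==1.17.*

# Exact pins, as used for the frequently bumped backends above.
exllamav2==0.0.18
```

Note that which style a line uses is a choice per dependency; the log shows exact pins for fast-moving backends (llama-cpp-python, ExLlamaV2) and wildcard pins for libraries updated via dependabot.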