Commit Graph

3655 Commits

Author SHA1 Message Date
oobabooga
0e6eb7c27a Add AQLM support (transformers loader) (#5466) 2024-03-08 17:30:36 -03:00
oobabooga
2681f6f640 Make superbooga & superboogav2 functional again (#5656) 2024-03-07 15:03:18 -03:00
oobabooga
bae14c8f13 Right-truncate long chat completion prompts instead of left-truncating
Instructions are usually at the beginning of the prompt.
2024-03-07 08:50:24 -08:00
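The rationale in the commit above (instructions usually sit at the start of the prompt, so overly long prompts should lose their tail, not their head) can be sketched as follows. The function name and token-list interface are illustrative assumptions, not the repository's actual code:

```python
def truncate_prompt(tokens, max_length):
    """Right-truncate: keep the first max_length tokens so that the
    instructions at the beginning of the prompt survive.
    (Illustrative sketch, not the repository's implementation.)"""
    if len(tokens) <= max_length:
        return tokens
    return tokens[:max_length]


# Left-truncation (the previous behavior) would keep tokens[-max_length:],
# dropping the instructions at the start of the prompt.
prompt = list(range(10))
print(truncate_prompt(prompt, 4))  # keeps the first 4 tokens: [0, 1, 2, 3]
```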
Bartowski
104573f7d4 Update cache_4bit documentation (#5649)
---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2024-03-07 13:08:21 -03:00
oobabooga
bef08129bc Small fix for cuda 11.8 in the one-click installer 2024-03-06 21:43:36 -08:00
oobabooga
303433001f Fix a check in the installer 2024-03-06 21:13:54 -08:00
oobabooga
bde7f00cae Change the exllamav2 version number 2024-03-06 21:08:29 -08:00
oobabooga
2ec1d96c91 Add cache_4bit option for ExLlamaV2 (#5645) 2024-03-06 23:02:25 -03:00
oobabooga
fa0e68cefd Installer: add back INSTALL_EXTENSIONS environment variable (for docker) 2024-03-06 11:31:06 -08:00
oobabooga
fcc92caa30 Installer: add option to install requirements for just one extension 2024-03-06 07:36:23 -08:00
oobabooga
2174958362 Revert gradio to 3.50.2 (#5640) 2024-03-06 11:52:46 -03:00
oobabooga
7eee9e9470 Add -k to curl command to download miniconda on windows (closes #5628) 2024-03-06 06:46:50 -08:00
oobabooga
03f03af535 Revert "Update peft requirement from ==0.8.* to ==0.9.* (#5626)"
This reverts commit 72a498ddd4.
2024-03-05 02:56:37 -08:00
oobabooga
d61e31e182 Save the extensions after Gradio 4 (#5632) 2024-03-05 07:54:34 -03:00
oobabooga
ae12d045ea Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2024-03-05 02:35:04 -08:00
dependabot[bot]
72a498ddd4 Update peft requirement from ==0.8.* to ==0.9.* (#5626) 2024-03-05 07:34:32 -03:00
oobabooga
1437f757a1 Bump HQQ to 0.1.5 2024-03-05 02:33:51 -08:00
oobabooga
63a1d4afc8 Bump gradio to 4.19 (#5522) 2024-03-05 07:32:28 -03:00
oobabooga
164ff2440d Use the correct PyTorch in the Colab notebook 2024-03-05 01:05:19 -08:00
oobabooga
3cfcab63a5 Update an installation message 2024-03-04 20:37:44 -08:00
oobabooga
907bda0d56 Move update_wizard_wsl.sh to update_wizard_wsl.bat 2024-03-04 19:57:49 -08:00
oobabooga
f697cb4609 Move update_wizard_windows.sh to update_wizard_windows.bat (oops) 2024-03-04 19:26:24 -08:00
oobabooga
2d74660733 Don't git pull on "Install/update extensions requirements" 2024-03-04 12:37:10 -08:00
oobabooga
fbe83854ca Minor message change 2024-03-04 11:10:37 -08:00
oobabooga
90ab022856 Minor message change 2024-03-04 10:54:16 -08:00
oobabooga
97dc3602fc Create an update wizard (#5623) 2024-03-04 15:52:24 -03:00
oobabooga
6adf222599 One-click installer: change an info message 2024-03-04 08:20:04 -08:00
oobabooga
4bb79c57ac One-click installer: change an info message 2024-03-04 08:11:55 -08:00
oobabooga
74564fe8d0 One-click installer: delete the Miniconda installer after completion 2024-03-04 08:11:03 -08:00
oobabooga
dc2dd5b9d8 One-click installer: add an info message before git pull 2024-03-04 08:00:39 -08:00
oobabooga
527ba98105 Do not install extensions requirements by default (#5621) 2024-03-04 04:46:39 -03:00
oobabooga
fa4ce0eee8 One-click installer: minor change to CMD_FLAGS.txt in CPU mode 2024-03-03 17:42:59 -08:00
oobabooga
8bd4960d05 Update PyTorch to 2.2 (also update flash-attn to 2.5.6) (#5618) 2024-03-03 19:40:32 -03:00
oobabooga
70047a5c57 Bump bitsandbytes to 0.42.0 on Windows 2024-03-03 13:19:27 -08:00
oobabooga
24e86bb21b Bump llama-cpp-python to 0.2.55 2024-03-03 12:14:48 -08:00
oobabooga
314e42fd98 Fix transformers requirement 2024-03-03 10:49:28 -08:00
oobabooga
71b1617c1b Remove bitsandbytes from incompatible requirements.txt files 2024-03-03 08:24:54 -08:00
kalomaze
cfb25c9b3f Cubic sampling w/ curve param (#5551)
---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2024-03-03 13:22:21 -03:00
jeffbiocode
3168644152 Training: Update llama2-chat-format.json (#5593) 2024-03-03 12:42:14 -03:00
oobabooga
71dc5b4dee Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2024-02-28 19:59:20 -08:00
oobabooga
09b13acfb2 Perplexity evaluation: print to terminal after calculation is finished 2024-02-28 19:58:21 -08:00
dependabot[bot]
dfdf6eb5b4 Bump hqq from 0.1.3 to 0.1.3.post1 (#5582) 2024-02-26 20:51:39 -03:00
oobabooga
332957ffec Bump llama-cpp-python to 0.2.52 2024-02-26 15:05:53 -08:00
oobabooga
b64770805b Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2024-02-26 08:51:31 -08:00
oobabooga
830168d3d4 Revert "Replace hashlib.sha256 with hashlib.file_digest so we don't need to load entire files into ram before hashing them. (#4383)"
This reverts commit 0ced78fdfa.
2024-02-26 05:54:33 -08:00
Bartowski
21acf504ce Bump transformers to 4.38 for gemma compatibility (#5575) 2024-02-25 20:15:13 -03:00
oobabooga
4164e29416 Block the "To create a public link, set share=True" gradio message 2024-02-25 15:06:08 -08:00
oobabooga
d34126255d Fix loading extensions with "-" in the name (closes #5557) 2024-02-25 09:24:52 -08:00
Lounger
0f68c6fb5b Big picture fixes (#5565) 2024-02-25 14:10:16 -03:00
jeffbiocode
45c4cd01c5 Add llama 2 chat format for lora training (#5553) 2024-02-25 02:36:36 -03:00