Commit Graph

80 Commits

Author  SHA1  Message  Date
oobabooga  ac30e7fe9c  Updater: don't reinstall requirements if no updates after git pull  2024-07-24 19:03:34 -07:00
oobabooga  fd7c3c5bb0  Don't git pull on installation (to make past releases installable)  2024-06-15 06:38:05 -07:00
oobabooga  9420973b62  Downgrade PyTorch to 2.2.2 (#6124)  2024-06-14 16:42:03 -03:00
oobabooga  8930bfc5f4  Bump PyTorch, ExLlamaV2, flash-attention (#6122)  2024-06-13 20:38:31 -03:00
oobabooga  bd7cc4234d  Backend cleanup (#6025)  2024-05-21 13:32:02 -03:00
oobabooga  51fb766bea  Add back my llama-cpp-python wheels, bump to 0.2.65 (#5964)  2024-04-30 09:11:31 -03:00
oobabooga  9b623b8a78  Bump llama-cpp-python to 0.2.64, use official wheels (#5921)  2024-04-23 23:17:05 -03:00
oobabooga  bef08129bc  Small fix for cuda 11.8 in the one-click installer  2024-03-06 21:43:36 -08:00
oobabooga  303433001f  Fix a check in the installer  2024-03-06 21:13:54 -08:00
oobabooga  fa0e68cefd  Installer: add back INSTALL_EXTENSIONS environment variable (for docker)  2024-03-06 11:31:06 -08:00
oobabooga  fcc92caa30  Installer: add option to install requirements for just one extension  2024-03-06 07:36:23 -08:00
oobabooga  3cfcab63a5  Update an installation message  2024-03-04 20:37:44 -08:00
oobabooga  f697cb4609  Move update_wizard_windows.sh to update_wizard_windows.bat (oops)  2024-03-04 19:26:24 -08:00
oobabooga  2d74660733  Don't git pull on "Install/update extensions requirements"  2024-03-04 12:37:10 -08:00
oobabooga  fbe83854ca  Minor message change  2024-03-04 11:10:37 -08:00
oobabooga  90ab022856  Minor message change  2024-03-04 10:54:16 -08:00
oobabooga  97dc3602fc  Create an update wizard (#5623)  2024-03-04 15:52:24 -03:00
oobabooga  6adf222599  One-click installer: change an info message  2024-03-04 08:20:04 -08:00
oobabooga  4bb79c57ac  One-click installer: change an info message  2024-03-04 08:11:55 -08:00
oobabooga  dc2dd5b9d8  One-click installer: add an info message before git pull  2024-03-04 08:00:39 -08:00
oobabooga  527ba98105  Do not install extensions requirements by default (#5621)  2024-03-04 04:46:39 -03:00
oobabooga  fa4ce0eee8  One-click installer: minor change to CMD_FLAGS.txt in CPU mode  2024-03-03 17:42:59 -08:00
oobabooga  8bd4960d05  Update PyTorch to 2.2 (also update flash-attn to 2.5.6) (#5618)  2024-03-03 19:40:32 -03:00
oobabooga  89f6036e98  Bump llama-cpp-python, remove python 3.8/3.9, cuda 11.7 (#5397)  2024-01-30 13:19:20 -03:00
oobabooga  d921f80322  one-click: minor fix after 5e87678fea  2024-01-28 06:14:15 -08:00
Evgenii  26c3ab367e  one-click: use f-strings to improve readability and unify with the rest code (#5068)  2024-01-27 17:31:22 -03:00
Andrew C. Dvorak  5e87678fea  Support running as a git submodule. (#5227)  2024-01-27 17:18:50 -03:00
oobabooga  c4c7fc4ab3  Lint  2024-01-07 09:36:56 -08:00
Yilong Guo  d93db3b486  Refine ipex setup (#5191)  2024-01-07 10:40:30 -03:00
oobabooga  9e86bea8e9  Use requirements_cpu.txt for intel  2024-01-04 18:52:14 -08:00
oobabooga  3d854ee516  Pin PyTorch version to 2.1 (#5056)  2024-01-04 23:50:23 -03:00
Matthew Raaff  c9c31f71b8  Various one-click installer improvements (#4994)  2024-01-04 23:41:54 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga  0e54a09bcb  Remove exllamav1 loaders (#5128)  2023-12-31 01:57:06 -03:00
Song Fuchang  127c71a22a  Update IPEX to 2.1.10+xpu (#4931)  2023-12-15 03:19:01 -03:00
    This will require Intel oneAPI Toolkit 2024.0
oobabooga  dde7921057  One-click installer: minor message change  2023-12-14 17:27:32 -08:00
oobabooga  fd1449de20  One-click installer: fix minor bug introduced in previous commit  2023-12-14 16:52:44 -08:00
oobabooga  4ae2dcebf5  One-click installer: more friendly progress messages  2023-12-14 16:48:00 -08:00
Song Fuchang  e16e5997ef  Update IPEX install URL. (#4825)  2023-12-06 21:07:01 -03:00
    Old pip url no longer works. Use the latest url from https://intel.github.io/intel-extension-for-pytorch/index.html#installation
erew123  f786aa3caa  Clean-up Ctrl+C Shutdown (#4802)  2023-12-05 02:16:16 -03:00
oobabooga  8d811a4d58  one-click: move on instead of crashing if extension fails to install  2023-11-21 16:09:44 -08:00
oobabooga  0047d9f5e0  Do not install coqui_tts requirements by default  2023-11-21 15:13:42 -08:00
    It breaks the one-click installer on Windows.
oobabooga  fb124ab6e2  Bump to flash-attention 2.3.4 + switch to Github Actions wheels on Windows (#4700)  2023-11-21 15:07:17 -08:00
oobabooga  9d6f79db74  Revert "Bump llama-cpp-python to 0.2.18 (#4611)"  2023-11-17 05:14:25 -08:00
    This reverts commit 923c8e25fb.
oobabooga  b2ce8dc7ee  Update a message  2023-11-16 18:46:26 -08:00
oobabooga  780b00e1cf  Minor bug fix  2023-11-16 18:39:39 -08:00
oobabooga  923c8e25fb  Bump llama-cpp-python to 0.2.18 (#4611)  2023-11-16 22:55:14 -03:00
oobabooga  e7d460d932  Make sure that API requirements are installed  2023-11-16 10:08:41 -08:00
oobabooga  cbf2b47476  Strip trailing "\" characters in CMD_FLAGS.txt  2023-11-16 09:33:36 -08:00
oobabooga  4f9bc63edf  Installer: update a message for clarity  2023-11-10 09:43:02 -08:00
Abhilash Majumder  778a010df8  Intel Gpu support initialization (#4340)  2023-10-26 23:39:51 -03:00
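The most recent entry above ("Updater: don't reinstall requirements if no updates after git pull") describes a check that can be sketched in a few lines: skip the pip step when `git pull` reports nothing new. A minimal shell sketch, assuming the decision keys off git's "Already up to date" output; the function name `needs_reinstall` and the exact detection are illustrative, not the repository's actual updater code:

```shell
# Hypothetical sketch (not the repository's actual updater logic):
# skip reinstalling requirements when `git pull` reports no changes.
needs_reinstall() {
    # $1: captured output of `git pull`
    case "$1" in
        *"Already up to date"*) return 1 ;;  # nothing new: skip pip install
        *) return 0 ;;                       # new commits: reinstall
    esac
}

# Usage against canned `git pull` outputs:
needs_reinstall "Already up to date." && echo "reinstall" || echo "skip"        # prints "skip"
needs_reinstall "Updating 1a2b3c..4d5e6f" && echo "reinstall" || echo "skip"    # prints "reinstall"
```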