| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| oobabooga | 923c8e25fb | Bump llama-cpp-python to 0.2.18 (#4611) | 2023-11-16 22:55:14 -03:00 |
| oobabooga | e7d460d932 | Make sure that API requirements are installed | 2023-11-16 10:08:41 -08:00 |
| oobabooga | cbf2b47476 | Strip trailing "\" characters in CMD_FLAGS.txt | 2023-11-16 09:33:36 -08:00 |
| oobabooga | 4f9bc63edf | Installer: update a message for clarity | 2023-11-10 09:43:02 -08:00 |
| Abhilash Majumder | 778a010df8 | Intel Gpu support initialization (#4340) | 2023-10-26 23:39:51 -03:00 |
| oobabooga | 2d97897a25 | Don't install flash-attention on windows + cuda 11 | 2023-10-25 11:21:18 -07:00 |
| mongolu | c18504f369 | USE_CUDA118 from ENV remains null one_click.py + cuda-toolkit (#4352) | 2023-10-22 12:37:24 -03:00 |
| oobabooga | 6efb990b60 | Add a proper documentation (#3885) | 2023-10-21 19:15:54 -03:00 |
| Brian Dashore | 3345da2ea4 | Add flash-attention 2 for windows (#4235) | 2023-10-21 03:46:23 -03:00 |
| oobabooga | 258d046218 | More robust way of initializing empty .git folder | 2023-10-20 23:13:09 -07:00 |
| oobabooga | 43be1be598 | Manually install CUDA runtime libraries | 2023-10-12 21:02:44 -07:00 |
| jllllll | 0eda9a0549 | Use GPTQ wheels compatible with Pytorch 2.1 (#4210) | 2023-10-07 00:35:41 -03:00 |
| oobabooga | d33facc9fe | Bump to pytorch 11.8 (#4209) | 2023-10-07 00:23:49 -03:00 |
| oobabooga | 771e936769 | Fix extensions install (2nd attempt) | 2023-09-28 14:33:49 -07:00 |
| oobabooga | 822ba7fcbb | Better error handling during install/update | 2023-09-28 13:57:59 -07:00 |
| oobabooga | 85f45cafa1 | Fix extensions install | 2023-09-28 13:54:36 -07:00 |
| Nathan Thomas | e145d9a0da | Update one_click.py to initialize site_packages_path variable (#4118) | 2023-09-28 08:31:29 -03:00 |
| HideLord | 0845724a89 | Supercharging superbooga (#3272) | 2023-09-26 21:30:19 -03:00 |
| jllllll | ad00b8eb26 | Check '--model-dir' for no models warning (#4067) | 2023-09-26 10:56:57 -03:00 |
| oobabooga | 44438c60e5 | Add INSTALL_EXTENSIONS environment variable | 2023-09-25 13:12:35 -07:00 |
| jllllll | c0fca23cb9 | Avoid importing torch in one-click-installer (#4064) | 2023-09-24 22:16:59 -03:00 |
| oobabooga | d5952cb540 | Don't assume that py-cpuinfo is installed | 2023-09-24 08:10:45 -07:00 |
| oobabooga | 2e7b6b0014 | Create alternative requirements.txt with AMD and Metal wheels (#4052) | 2023-09-24 09:58:29 -03:00 |
| oobabooga | 30d7c4eaa1 | Forward --help to server.py | 2023-09-23 07:27:27 -07:00 |
| oobabooga | c2ae01fb04 | Improved readability | 2023-09-23 07:10:01 -07:00 |
| oobabooga | fc351ff3e5 | Improved readability | 2023-09-23 06:48:09 -07:00 |
| oobabooga | e6f445f3eb | Improved readability of one_click.py | 2023-09-23 06:28:58 -07:00 |
| oobabooga | 639723845a | Make N the "None" install option | 2023-09-23 05:25:06 -07:00 |
| oobabooga | 0306b61bb0 | Add IPEX option to the installer (experimental) | 2023-09-23 05:17:41 -07:00 |
| mongolu | d70b8d9048 | Added two ENVs in webui.py for docker (#111) | 2023-09-22 19:04:41 -07:00 |
| oobabooga | c5e0ab7174 | Minor bug fix | 2023-09-22 14:50:27 -07:00 |
| oobabooga | a0c7d764b5 | Fix pytorch installation on Linux | 2023-09-22 14:34:30 -07:00 |
| deevis | 7f0ea4dc16 | feature: allow comments in CMD_FLAGS.txt (#127) (Co-authored-by: missionfloyd) | 2023-09-22 13:28:24 -07:00 |
| oobabooga | 0fee18e8b7 | Rename some functions | 2023-09-22 12:08:05 -07:00 |
| oobabooga | 6c5f81f002 | Rename webui.py to one_click.py | 2023-09-22 12:00:06 -07:00 |