Mirror of https://github.com/oobabooga/text-generation-webui.git, synced 2024-11-27 01:59:14 +01:00
Use the cuda branch of gptq-for-llama
Did I do this right, @jllllll? This is needed because the current default branch (triton) is not compatible with Windows.
This commit is contained in:
parent 0de4f24b12
commit 78c0da4a18
@@ -93,6 +93,7 @@ cd repositories || goto end
 if not exist GPTQ-for-LLaMa\ (
 git clone https://github.com/qwopqwop200/GPTQ-for-LLaMa.git
 cd GPTQ-for-LLaMa || goto end
+git checkout cuda
 call python -m pip install -r requirements.txt
 call python setup_cuda.py install
 if not exist "%INSTALL_ENV_DIR%\lib\site-packages\quant_cuda-0.0.0-py3.10-win-amd64.egg" (
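For anyone applying this by hand instead of rerunning the installer, the hunk amounts to the following commands, run from the webui's `repositories` directory. This is a sketch, not the installer itself: it assumes `git` and the webui's Python environment are already on PATH.

```shell
# Manual equivalent of the installer change (a sketch; assumes git and the
# webui's Python are on PATH, and the current directory is .../repositories).
git clone https://github.com/qwopqwop200/GPTQ-for-LLaMa.git
cd GPTQ-for-LLaMa
git checkout cuda   # the default triton branch does not build on Windows
python -m pip install -r requirements.txt
python setup_cuda.py install   # builds and installs the quant_cuda extension
```

If the `quant_cuda-0.0.0-py3.10-win-amd64.egg` check in the script still fails afterwards, the `setup_cuda.py install` step is the one to re-examine.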