A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
Latest commit: 78c0da4a18 by oobabooga — "Use the cuda branch of gptq-for-llama" (2023-03-30 18:04:05 -03:00)
Commit message body: "Did I do this right @jllllll? This is because the current default branch (triton) is not compatible with Windows."
download-model.bat    Changed things around to allow Micromamba to work with paths containing spaces.    2023-03-25 01:26:25 -05:00
install.bat           Use the cuda branch of gptq-for-llama                                              2023-03-30 18:04:05 -03:00
INSTRUCTIONS.txt      Update INSTRUCTIONS.txt                                                            2023-03-24 18:28:46 -05:00
micromamba-cmd.bat    Changed things around to allow Micromamba to work with paths containing spaces.    2023-03-25 01:26:25 -05:00
start-webui.bat       Changed things around to allow Micromamba to work with paths containing spaces.    2023-03-25 01:26:25 -05:00