Updated Home (markdown)

oobabooga 2023-10-21 18:18:42 -03:00
parent 35aaeb645a
commit c92a712ca0

@@ -18,4 +18,5 @@
✅ = implemented
\* Training LoRAs with GPTQ models also works with the Transformers loader. Make sure to check "auto-devices" and "disable_exllama" before loading the model.
\*\* Requires the monkey-patch. The instructions can be found [here](https://github.com/oobabooga/text-generation-webui/wiki/08-%E2%80%90-Additional-Tips#using-loras-with-gptq-for-llama).
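
For context on the Transformers-loader footnote above: the "auto-devices" and "disable_exllama" checkboxes roughly correspond to automatic device placement and to turning off the ExLlama kernels (which do not support training) in the underlying Transformers/AutoGPTQ stack. Below is a minimal sketch of what an equivalent direct Transformers + PEFT setup might look like; the model id, the LoRA hyperparameters, and the exact `disable_exllama` argument name are illustrative assumptions, not taken from the webui code.

```python
# Minimal sketch, not the webui's actual implementation: the model id,
# LoRA hyperparameters, and option names below are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig
from peft import LoraConfig, get_peft_model

model_id = "TheBloke/Llama-2-7B-GPTQ"  # placeholder GPTQ checkpoint

# "disable_exllama": the ExLlama kernels do not support training, so turn them off
quant_config = GPTQConfig(bits=4, disable_exllama=True)

# "auto-devices": let Transformers spread the weights across the available devices
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    quantization_config=quant_config,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Attach a LoRA adapter so that only the adapter weights are trained
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```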