diff --git a/Home.md b/Home.md
index 97bc05a..288da24 100644
--- a/Home.md
+++ b/Home.md
@@ -17,10 +17,10 @@
 | llamacpp_HF | ❌ | ❌ | ❌ | ✅ | ✅ |
 | ctransformers | ❌ | ❌ | ❌ | ❌ | ❌ |
 
-❌ = not implemented
-
 ✅ = implemented
 
+❌ = not implemented
+
 \* For training LoRAs with GPTQ models, use this loader with the options `auto_devices` and `disable_exllama` checked.
 
 \*\* Needs the monkey-patch.