From 8ac0c1560ee69e4f0dd079b0c08ae969b82cda6a Mon Sep 17 00:00:00 2001
From: oobabooga <112222186+oobabooga@users.noreply.github.com>
Date: Sun, 22 Oct 2023 16:07:22 -0300
Subject: [PATCH] Updated Home (markdown)

---
 Home.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/Home.md b/Home.md
index 225a057..1ea463b 100644
--- a/Home.md
+++ b/Home.md
@@ -2,13 +2,13 @@
 
 | Loader         | Loading 1 LoRA | Loading 2 or more LoRAs | Training LoRAs | Multimodal extension | Perplexity evaluation |
 |----------------|----------------|-------------------------|----------------|----------------------|-----------------------|
-| Transformers   | ✅             | ❌                      | ✅*            | ✅                   | ✅                    |
+| Transformers   | ✅             | ✅                      | ✅*            | ✅                   | ✅                    |
 | ExLlama_HF     | ✅             | ❌                      | ❌             | ❌                   | ✅                    |
 | ExLlamav2_HF   | ✅             | ✅                      | ❌             | ❌                   | ✅                    |
 | ExLlama        | ✅             | ❌                      | ❌             | ❌                   | use ExLlama_HF        |
 | ExLlamav2      | ✅             | ✅                      | ❌             | ❌                   | use ExLlamav2_HF      |
 | AutoGPTQ       | ✅             | ❌                      | ❌             | ✅                   | ✅                    |
-| GPTQ-for-LLaMa | ✅**           | ❌                      | ✅             | ✅                   | ✅                    |
+| GPTQ-for-LLaMa | ✅**           | ✅                      | ✅             | ✅                   | ✅                    |
 | llama.cpp      | ❌             | ❌                      | ❌             | ❌                   | use llamacpp_HF       |
 | llamacpp_HF    | ❌             | ❌                      | ❌             | ❌                   | ✅                    |
 | ctransformers  | ❌             | ❌                      | ❌             | ❌                   | ❌                    |