From ae7c14cc395b2b308a986180474010cf102db79b Mon Sep 17 00:00:00 2001
From: oobabooga <112222186+oobabooga@users.noreply.github.com>
Date: Sat, 21 Oct 2023 13:16:45 -0300
Subject: [PATCH] Updated Home (markdown)

---
 Home.md | 24 ++++++++++++------------
 1 file changed, 12 insertions(+), 12 deletions(-)

diff --git a/Home.md b/Home.md
index 5209160..29c2444 100644
--- a/Home.md
+++ b/Home.md
@@ -2,18 +2,18 @@
 This is WIP. Please come back later.
 
 ## What works
 
-| Loader         | Loading 1 LoRA | Loading 2 or more LoRAs | Training LoRAs | Multimodal extension | Perplexity evaluation | Classifier-Free Guidance (CFG) |
-|----------------|----------------|-------------------------|----------------|----------------------|-----------------------|--------------------------------|
-| Transformers   | ✅             | ❌                      | ✅*            | ✅                   | ✅                    | ✅                             |
-| ExLlama_HF     | ✅             | ❌                      | ❌             | ❌                   | ✅                    | ✅                             |
-| ExLlamav2_HF   | ✅             | ✅                      | ❌             | ❌                   | ✅                    | ✅                             |
-| ExLlama        | ✅             | ❌                      | ❌             | ❌                   | use ExLlama_HF        | ✅                             |
-| ExLlamav2      | ✅             | ✅                      | ❌             | ❌                   | use ExLlamav2_HF      | ❌                             |
-| AutoGPTQ       | ✅             | ❌                      | ❌             | ✅                   | ✅                    | ✅                             |
-| GPTQ-for-LLaMa | ✅             | ❌                      | ✅             | ✅                   | ✅                    | ✅                             |
-| llama.cpp      | ❌             | ❌                      | ❌             | ❌                   | use llamacpp_HF       | ❌                             |
-| llamacpp_HF    | ❌             | ❌                      | ❌             | ❌                   | ✅                    | ✅                             |
-| ctransformers  | ❌             | ❌                      | ❌             | ❌                   | ❌                    | ❌                             |
+| Loader         | Loading 1 LoRA | Loading 2 or more LoRAs | Training LoRAs | Multimodal extension | Perplexity evaluation |
+|----------------|----------------|-------------------------|----------------|----------------------|-----------------------|
+| Transformers   | ✅             | ❌                      | ✅*            | ✅                   | ✅                    |
+| ExLlama_HF     | ✅             | ❌                      | ❌             | ❌                   | ✅                    |
+| ExLlamav2_HF   | ✅             | ✅                      | ❌             | ❌                   | ✅                    |
+| ExLlama        | ✅             | ❌                      | ❌             | ❌                   | use ExLlama_HF        |
+| ExLlamav2      | ✅             | ✅                      | ❌             | ❌                   | use ExLlamav2_HF      |
+| AutoGPTQ       | ✅             | ❌                      | ❌             | ✅                   | ✅                    |
+| GPTQ-for-LLaMa | ✅             | ❌                      | ✅             | ✅                   | ✅                    |
+| llama.cpp      | ❌             | ❌                      | ❌             | ❌                   | use llamacpp_HF       |
+| llamacpp_HF    | ❌             | ❌                      | ❌             | ❌                   | ✅                    |
+| ctransformers  | ❌             | ❌                      | ❌             | ❌                   | ❌                    |
 
 ❌ = not implemented