diff --git a/Home.md b/Home.md
index 5209160..29c2444 100644
--- a/Home.md
+++ b/Home.md
@@ -2,18 +2,18 @@ This is WIP. Please come back later.
 
 ## What works
 
-| Loader | Loading 1 LoRA | Loading 2 or more LoRAs | Training LoRAs | Multimodal extension | Perplexity evaluation | Classifier-Free Guidance (CFG) |
-|----------------|----------------|-------------------------|----------------|----------------------|-----------------------|--------------------------------|
-| Transformers | ✅ | ❌ | ✅* | ✅ | ✅ | ✅ |
-| ExLlama_HF | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ |
-| ExLlamav2_HF | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ |
-| ExLlama | ✅ | ❌ | ❌ | ❌ | use ExLlama_HF | ✅ |
-| ExLlamav2 | ✅ | ✅ | ❌ | ❌ | use ExLlamav2_HF | ❌ |
-| AutoGPTQ | ✅ | ❌ | ❌ | ✅ | ✅ | ✅ |
-| GPTQ-for-LLaMa | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
-| llama.cpp | ❌ | ❌ | ❌ | ❌ | use llamacpp_HF | ❌ |
-| llamacpp_HF | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ |
-| ctransformers | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
+| Loader | Loading 1 LoRA | Loading 2 or more LoRAs | Training LoRAs | Multimodal extension | Perplexity evaluation |
+|----------------|----------------|-------------------------|----------------|----------------------|-----------------------|
+| Transformers | ✅ | ❌ | ✅* | ✅ | ✅ |
+| ExLlama_HF | ✅ | ❌ | ❌ | ❌ | ✅ |
+| ExLlamav2_HF | ✅ | ✅ | ❌ | ❌ | ✅ |
+| ExLlama | ✅ | ❌ | ❌ | ❌ | use ExLlama_HF |
+| ExLlamav2 | ✅ | ✅ | ❌ | ❌ | use ExLlamav2_HF |
+| AutoGPTQ | ✅ | ❌ | ❌ | ✅ | ✅ |
+| GPTQ-for-LLaMa | ✅ | ❌ | ✅ | ✅ | ✅ |
+| llama.cpp | ❌ | ❌ | ❌ | ❌ | use llamacpp_HF |
+| llamacpp_HF | ❌ | ❌ | ❌ | ❌ | ✅ |
+| ctransformers | ❌ | ❌ | ❌ | ❌ | ❌ |
 
 ❌ = not implemented