Commit Graph

17 Commits

Author SHA1 Message Date
oobabooga
03dc69edc5 ExLlama_HF (v1 and v2) prefix matching 2023-09-19 13:12:19 -07:00
oobabooga
df52dab67b Lint 2023-09-11 07:57:38 -07:00
Forkoz
15e9b8c915 Exllama new rope settings (#3852) 2023-09-11 01:14:36 -03:00
oobabooga
52ab2a6b9e Add rope_freq_base parameter for CodeLlama 2023-08-25 06:55:15 -07:00
oobabooga
d6934bc7bc Implement CFG for ExLlama_HF (#3666) 2023-08-24 16:27:36 -03:00
oobabooga
0af10ab49b Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) 2023-08-06 17:22:48 -03:00
oobabooga
94dfcec237 Make it possible to evaluate exllama perplexity (#3138) 2023-07-16 01:52:55 -03:00
Panchovix
10c8c197bf Add Support for Static NTK RoPE scaling for exllama/exllama_hf (#2955) 2023-07-04 01:13:16 -03:00
ardfork
3c076c3c80 Disable half2 for ExLlama when using HIP (#2912) 2023-06-29 15:03:16 -03:00
oobabooga
20740ab16e Revert "Fix exllama_hf gibbersh above 2048 context, and works >5000 context. (#2913)" 2023-06-28 18:10:34 -03:00
This reverts commit 37a16d23a7.
Panchovix
37a16d23a7 Fix exllama_hf gibbersh above 2048 context, and works >5000 context. (#2913) 2023-06-28 12:36:07 -03:00
oobabooga
22d455b072 Add LoRA support to ExLlama_HF 2023-06-26 00:10:33 -03:00
oobabooga
c52290de50 ExLlama with long context (#2875) 2023-06-25 22:49:26 -03:00
jllllll
bef67af23c Use pre-compiled python module for ExLlama (#2770) 2023-06-24 20:24:17 -03:00
oobabooga
cec5fb0ef6 Failed attempt at evaluating exllama_hf perplexity 2023-06-24 12:02:25 -03:00
Panchovix
b4a38c24b7 Fix Multi-GPU not working on exllama_hf (#2803) 2023-06-22 16:05:25 -03:00
LarryVRH
580c1ee748 Implement a demo HF wrapper for exllama to utilize existing HF transformers decoding. (#2777) 2023-06-21 15:31:42 -03:00
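
For context on the CFG commits above (0af10ab49b, d6934bc7bc), the usual classifier-free guidance step mixes the logits from the full prompt and a negative prompt. The sketch below is a minimal illustration of that mixing rule only; the function name, the cfg_scale parameter, and the dummy vocabulary are illustrative assumptions, not this repository's actual API.

```python
import numpy as np

def log_softmax(x: np.ndarray) -> np.ndarray:
    # Numerically stable log-softmax so both logit vectors live in the same space.
    x = x - x.max()
    return x - np.log(np.exp(x).sum())

def cfg_mix(cond_logits: np.ndarray, uncond_logits: np.ndarray, cfg_scale: float) -> np.ndarray:
    """Combine logits from the conditional (full) prompt and the negative prompt.

    cfg_scale = 1.0 reproduces the conditional distribution; larger values push
    generation further away from what the negative prompt would produce.
    (Illustrative helper, not taken from text-generation-webui.)
    """
    cond = log_softmax(cond_logits)
    uncond = log_softmax(uncond_logits)
    return uncond + cfg_scale * (cond - uncond)

# Toy example over a 5-token vocabulary.
cond = np.array([2.0, 1.0, 0.5, -1.0, 0.0])
uncond = np.array([1.5, 1.2, 0.4, -0.5, 0.1])
print(cfg_mix(cond, uncond, cfg_scale=1.5))
```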
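
Similarly, the RoPE-related commits (15e9b8c915, 52ab2a6b9e, 10c8c197bf) concern the rotary frequency base and static NTK "alpha" scaling. The following is a minimal sketch of how those two knobs are commonly combined; the function name, the d/(d-2) exponent from the standard NTK-aware formula, and the CodeLlama default base of 1e6 are stated as general background, not read from this repository's code.

```python
import numpy as np

def rope_inv_freq(head_dim: int, base: float = 10000.0, alpha: float = 1.0) -> np.ndarray:
    """Rotary-embedding inverse frequencies with optional static NTK alpha scaling.

    alpha > 1 raises the effective base, stretching the positional period so the
    model tolerates contexts longer than it was trained on. (Illustrative sketch.)
    """
    if alpha != 1.0:
        base = base * alpha ** (head_dim / (head_dim - 2))
    return 1.0 / base ** (np.arange(0, head_dim, 2) / head_dim)

# Plain Llama-style RoPE, an NTK-scaled variant, and a CodeLlama-style large base.
print(rope_inv_freq(128)[:4])
print(rope_inv_freq(128, alpha=2.0)[:4])
print(rope_inv_freq(128, base=1_000_000.0)[:4])
```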