Commit Graph

2439 Commits

Author            SHA1        Date                        Message
oobabooga         d826bc5d1b  2023-08-27 01:51:00 -03:00  Merge pull request #3697 from jllllll/llamacpp-ggml
    Use separate llama-cpp-python packages for GGML support
jllllll           4d61a7d9da  2023-08-26 14:07:46 -05:00  Account for deprecated GGML parameters
jllllll           4a999e3bcd  2023-08-26 10:40:08 -05:00  Use separate llama-cpp-python packages for GGML support
oobabooga         6e6431e73f  2023-08-26 01:07:28 -07:00  Update requirements.txt
oobabooga         83640d6f43  2023-08-26 01:06:59 -07:00  Replace ggml occurences with gguf
oobabooga         1a642c12b5  2023-08-26 00:45:07 -07:00  Fix silero_tts HTML unescaping
jllllll           db42b365c9  2023-08-25 14:37:02 -03:00  Fix ctransformers threads auto-detection (#3688)
oobabooga         0bcecaa216  2023-08-25 07:59:23 -07:00  Set mode: instruct for CodeLlama-instruct
cal066            960980247f  2023-08-25 11:33:04 -03:00  ctransformers: gguf support (#3685)
oobabooga         21058c37f7  2023-08-25 07:10:26 -07:00  Add missing file
oobabooga         f4f04c8c32  2023-08-25 07:08:38 -07:00  Fix a typo
oobabooga         5c7d8bfdfd  2023-08-25 07:06:57 -07:00  Detect CodeLlama settings
oobabooga         52ab2a6b9e  2023-08-25 06:55:15 -07:00  Add rope_freq_base parameter for CodeLlama
oobabooga         feecd8190f  2023-08-24 21:01:09 -07:00  Unescape inline code blocks
oobabooga         26c5e5e878  2023-08-24 19:23:08 -07:00  Bump autogptq
oobabooga         a2c67262c7  2023-08-24 17:27:12 -07:00  Unescape model output for silero/elevenlabs
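Several of the commits above ("Fix silero_tts HTML unescaping", "Unescape inline code blocks", "Unescape model output for silero/elevenlabs") revolve around the same step: undoing the HTML escaping that the chat renderer applies before text is handed to a TTS backend. A minimal stdlib sketch of that step — the function name is illustrative, not the repository's actual code:

```python
import html

def prepare_for_tts(rendered_text: str) -> str:
    """Undo HTML escaping so a TTS engine reads '&' rather than '&amp;'.

    Hypothetical helper: the real extension code in the repository may
    differ; this only illustrates the unescaping step itself.
    """
    return html.unescape(rendered_text)
```

Without this step, entities such as `&amp;` and `&#x27;` would be spoken literally by the TTS engine.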
oobabooga         3320accfdc  2023-08-24 20:32:21 -03:00  Add CFG to llamacpp_HF (second attempt) (#3678)
oobabooga         d6934bc7bc  2023-08-24 16:27:36 -03:00  Implement CFG for ExLlama_HF (#3666)
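The two CFG commits above (for llamacpp_HF and ExLlama_HF) apply classifier-free guidance to text generation: the model is evaluated twice per step, once with the normal prompt and once with a negative/unconditional prompt, and the two logit vectors are mixed with the standard formula `guided = uncond + scale * (cond - uncond)`. A toy sketch of that mixing with plain Python lists — not the repository's implementation:

```python
def cfg_mix(cond_logits, uncond_logits, guidance_scale):
    """Classifier-free guidance over one step's logits.

    guidance_scale == 1.0 reproduces the conditional logits unchanged;
    larger values push the distribution further away from the
    unconditional (negative-prompt) pass.
    """
    return [u + guidance_scale * (c - u)
            for c, u in zip(cond_logits, uncond_logits)]
```

In the real loaders this mixing happens inside the sampling loop on GPU tensors rather than Python lists.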
oobabooga         2b675533f7  2023-08-23 14:36:03 -07:00  Un-bump safetensors
    The newest one doesn't work on Windows yet
oobabooga         87442c6d18  2023-08-22 21:00:12 -07:00  Fix Notebook Logits tab
oobabooga         c0b119c3a3  2023-08-22 20:35:12 -07:00  Improve logit viewer format
oobabooga         6d6f40e8f8  2023-08-22 20:18:45 -07:00  Merge remote-tracking branch 'refs/remotes/origin/main'
oobabooga         8545052c9d  2023-08-22 20:18:16 -07:00  Add the option to use samplers in the logit viewer
Sam               0b352ea7ef  2023-08-22 17:41:11 -03:00  Add missing extensions to Dockerfile (#3544)
oobabooga         25e5eaa6a6  2023-08-22 13:16:44 -07:00  Remove outdated training warning
oobabooga         335c49cc7e  2023-08-22 13:14:59 -07:00  Bump peft and transformers
oobabooga         727fd229f4  2023-08-22 13:03:17 -07:00  Increase stalebot timeout to 6 weeks
tkbit             df165fe6c4  2023-08-22 16:55:17 -03:00  Use numpy==1.24 in requirements.txt (#3651)
    The whisper extension needs numpy 1.24 to work properly
cal066            e042bf8624  2023-08-22 16:51:34 -03:00  ctransformers: add mlock and no-mmap options (#3649)
tdrussell         2da38e89e6  2023-08-22 16:49:37 -03:00  Fix whitespace formatting in perplexity_colors extension. (#3643)
oobabooga         1b419f656f  2023-08-21 11:57:51 -07:00  Acknowledge a16z support
oobabooga         6cca8b8028  2023-08-21 05:39:55 -07:00  Only update notebook token counter on input
    For performance during streaming
oobabooga         41b98e07fb  2023-08-20 22:09:18 -07:00  Minor CSS fix
oobabooga         2cb07065ec  2023-08-20 21:50:42 -07:00  Fix an escaping bug
oobabooga         a74dd9003f  2023-08-20 21:40:22 -07:00  Fix HTML escaping for perplexity_colors extension
oobabooga         6394fef1db  2023-08-20 21:02:53 -07:00  Rewrite tab detection js
oobabooga         57036abc76  2023-08-20 19:54:59 -07:00  Add "send to default/notebook" buttons to chat tab
oobabooga         429cacd715  2023-08-20 19:37:33 -07:00  Add a token counter similar to automatic1111
    It can now be found in the Default and Notebook tabs
oobabooga         120fb86c6a  2023-08-20 20:49:21 -03:00  Add a simple logit viewer (#3636)
SeanScripts       2c1fd0d72b  2023-08-20 20:28:14 -03:00  Add probability dropdown to perplexity_colors extension (#3148)
Thomas De Bonnet  0dfd1a8b7d  2023-08-20 20:13:13 -03:00  Improve readability of download-model.py (#3497)
oobabooga         457fedfa36  2023-08-20 16:02:44 -07:00  Remove niche dockerfile
oobabooga         ef17da70af  2023-08-20 08:53:26 -07:00  Fix ExLlama truncation
oobabooga         ee964bcce9  2023-08-20 07:01:43 -07:00  Update a comment about RoPE scaling
missionfloyd      1cae784761  2023-08-19 09:29:08 -03:00  Unescape last message (#3623)
Cebtenzzre        942ad6067d  2023-08-19 00:17:27 -03:00  llama.cpp: make Stop button work with streaming disabled (#3620)
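The llama.cpp stop-button commit above concerns interrupting a generation that runs as one blocking call rather than a token-at-a-time stream. A common pattern for this is to share a flag that the generation loop checks between tokens; a self-contained sketch using `threading.Event` — the generator here is a stand-in, not llama.cpp's actual API:

```python
import threading

stop_event = threading.Event()

def generate(prompt: str, max_tokens: int = 100) -> str:
    """Stand-in generator: emits one placeholder 'token' per step and
    checks the shared stop flag between steps, so a Stop button can
    interrupt it even when nothing is being streamed back to the UI."""
    tokens = []
    for i in range(max_tokens):
        if stop_event.is_set():  # the Stop button handler sets this flag
            break
        tokens.append(f"tok{i}")
    return " ".join(tokens)
```

In a UI, the Stop button's callback would call `stop_event.set()` from another thread, and the generation thread would return whatever was produced so far.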
oobabooga         f6724a1a01  2023-08-18 13:04:45 -07:00  Return the visible history with "Copy last reply"
oobabooga         b96fd22a81  2023-08-18 16:58:38 -03:00  Refactor the training tab (#3619)
oobabooga         54df0bfad1  2023-08-18 09:43:15 -07:00  Update README.md
oobabooga         f50f534b0f  2023-08-18 09:37:20 -07:00  Add note about AMD/Metal to README