Commit Graph

2793 Commits

Author  SHA1  Message  Date

oobabooga  f4f04c8c32  Fix a typo  2023-08-25 07:08:38 -07:00
oobabooga  5c7d8bfdfd  Detect CodeLlama settings  2023-08-25 07:06:57 -07:00
oobabooga  52ab2a6b9e  Add rope_freq_base parameter for CodeLlama  2023-08-25 06:55:15 -07:00
oobabooga  feecd8190f  Unescape inline code blocks  2023-08-24 21:01:09 -07:00
oobabooga  26c5e5e878  Bump autogptq  2023-08-24 19:23:08 -07:00
oobabooga  a2c67262c7  Unescape model output for silero/elevenlabs  2023-08-24 17:27:12 -07:00
oobabooga  3320accfdc  Add CFG to llamacpp_HF (second attempt) (#3678)  2023-08-24 20:32:21 -03:00
oobabooga  d6934bc7bc  Implement CFG for ExLlama_HF (#3666)  2023-08-24 16:27:36 -03:00
oobabooga  2b675533f7  Un-bump safetensors  2023-08-23 14:36:03 -07:00
    The newest one doesn't work on Windows yet
oobabooga  87442c6d18  Fix Notebook Logits tab  2023-08-22 21:00:12 -07:00
oobabooga  c0b119c3a3  Improve logit viewer format  2023-08-22 20:35:12 -07:00
oobabooga  6d6f40e8f8  Merge remote-tracking branch 'refs/remotes/origin/main'  2023-08-22 20:18:45 -07:00
oobabooga  8545052c9d  Add the option to use samplers in the logit viewer  2023-08-22 20:18:16 -07:00
oobabooga  d7c98fe715  Update stale.yml  2023-08-22 21:48:32 -03:00
Sam  0b352ea7ef  Add missing extensions to Dockerfile (#3544)  2023-08-22 17:41:11 -03:00
oobabooga  25e5eaa6a6  Remove outdated training warning  2023-08-22 13:16:44 -07:00
oobabooga  335c49cc7e  Bump peft and transformers  2023-08-22 13:14:59 -07:00
oobabooga  727fd229f4  Increase stalebot timeout to 6 weeks  2023-08-22 13:03:17 -07:00
tkbit  df165fe6c4  Use numpy==1.24 in requirements.txt (#3651)  2023-08-22 16:55:17 -03:00
    The whisper extension needs numpy 1.24 to work properly
cal066  e042bf8624  ctransformers: add mlock and no-mmap options (#3649)  2023-08-22 16:51:34 -03:00
tdrussell  2da38e89e6  Fix whitespace formatting in perplexity_colors extension. (#3643)  2023-08-22 16:49:37 -03:00
oobabooga  1b419f656f  Acknowledge a16z support  2023-08-21 11:57:51 -07:00
oobabooga  6cca8b8028  Only update notebook token counter on input  2023-08-21 05:39:55 -07:00
    For performance during streaming
oobabooga  41b98e07fb  Minor CSS fix  2023-08-20 22:09:18 -07:00
oobabooga  2cb07065ec  Fix an escaping bug  2023-08-20 21:50:42 -07:00
oobabooga  a74dd9003f  Fix HTML escaping for perplexity_colors extension  2023-08-20 21:40:22 -07:00
oobabooga  6394fef1db  Rewrite tab detection js  2023-08-20 21:02:53 -07:00
oobabooga  57036abc76  Add "send to default/notebook" buttons to chat tab  2023-08-20 19:54:59 -07:00
oobabooga  429cacd715  Add a token counter similar to automatic1111  2023-08-20 19:37:33 -07:00
    It can now be found in the Default and Notebook tabs
oobabooga  120fb86c6a  Add a simple logit viewer (#3636)  2023-08-20 20:49:21 -03:00
SeanScripts  2c1fd0d72b  Add probability dropdown to perplexity_colors extension (#3148)  2023-08-20 20:28:14 -03:00
Thomas De Bonnet  0dfd1a8b7d  Improve readability of download-model.py (#3497)  2023-08-20 20:13:13 -03:00
oobabooga  457fedfa36  Remove niche dockerfile  2023-08-20 16:02:44 -07:00
oobabooga  ef17da70af  Fix ExLlama truncation  2023-08-20 08:53:26 -07:00
oobabooga  ee964bcce9  Update a comment about RoPE scaling  2023-08-20 07:01:43 -07:00
missionfloyd  1cae784761  Unescape last message (#3623)  2023-08-19 09:29:08 -03:00
Cebtenzzre  942ad6067d  llama.cpp: make Stop button work with streaming disabled (#3620)  2023-08-19 00:17:27 -03:00
oobabooga  f6724a1a01  Return the visible history with "Copy last reply"  2023-08-18 13:04:45 -07:00
oobabooga  b96fd22a81  Refactor the training tab (#3619)  2023-08-18 16:58:38 -03:00
oobabooga  54df0bfad1  Update README.md  2023-08-18 09:43:15 -07:00
oobabooga  f50f534b0f  Add note about AMD/Metal to README  2023-08-18 09:37:20 -07:00
oobabooga  c4733000d7  Return the visible history with "Remove last"  2023-08-18 09:25:51 -07:00
oobabooga  5a6e7057b9  Merge branch 'bump-llamacpp'  2023-08-18 08:05:24 -07:00
jllllll  1a71ab58a9  Bump llama_cpp_python_cuda to 0.1.78 (#3614)  2023-08-18 12:04:01 -03:00
oobabooga  7cba000421  Bump llama-cpp-python, +tensor_split by @shouyiwang, +mul_mat_q (#3610)  2023-08-18 12:03:34 -03:00
oobabooga  d8f660e586  Add to modules/loaders.py  2023-08-18 08:00:22 -07:00
oobabooga  4ec42679e3  Add --mul_mat_q param  2023-08-18 07:58:20 -07:00
oobabooga  28cf5862af  Add UI element for tensor_split  2023-08-18 06:26:48 -07:00
missionfloyd  4b69f4f6ae  Fix print CSS (#3608)  2023-08-18 01:44:22 -03:00
oobabooga  6170b5ba31  Bump llama-cpp-python  2023-08-17 21:41:02 -07:00
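
Two of the commits above (52ab2a6b9e "Add rope_freq_base parameter for CodeLlama" and ee964bcce9 "Update a comment about RoPE scaling") concern the rotary-embedding frequency base. As background, here is a minimal sketch of what such a parameter controls; the function name and defaults are illustrative, not the project's actual implementation:

```python
def rope_frequencies(head_dim: int, rope_freq_base: float = 10000.0) -> list[float]:
    """Inverse frequencies used by rotary position embeddings (RoPE).

    Each pair of dimensions (2i, 2i+1) rotates at frequency base^(-2i/d).
    CodeLlama-style models raise the base (e.g. to 1e6), so positions
    rotate more slowly and longer contexts remain distinguishable.
    """
    return [rope_freq_base ** (-2 * i / head_dim) for i in range(head_dim // 2)]


# A larger base yields slower rotation for the same dimension index.
default_freqs = rope_frequencies(128)                        # LLaMA-style base 10000
codellama_freqs = rope_frequencies(128, rope_freq_base=1e6)  # CodeLlama-style base
print(codellama_freqs[1] < default_freqs[1])  # True
```

This is only a sketch of the frequency schedule itself; how the loader plumbs `rope_freq_base` through to the backend is backend-specific.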