c75f98a6d6 | oobabooga        | 2023-08-28 18:22:03 -07:00 | Autoscroll Notebook/Default textareas during streaming
22b2a30ec7 | jllllll          | 2023-08-28 18:02:24 -03:00 | Bump llama-cpp-python to 0.1.82 (#3730)
558e918fd6 | oobabooga        | 2023-08-28 13:50:36 -07:00 | Add a typing dots (...) animation to chat tab
57e9ded00c | oobabooga        | 2023-08-28 16:03:20 -03:00 | Make it possible to scroll during streaming (#3721)
7d3a0b5387 | jllllll          | 2023-08-27 22:38:41 -03:00 | Bump llama-cpp-python to 0.1.81 (#3716)
fdef0e4efa | oobabooga        | 2023-08-27 16:45:37 -07:00 | Focus on chat input field after Ctrl+S
2f5d769a8d | Cebtenzzre       | 2023-08-27 18:54:43 -03:00 | accept floating-point alpha value on the command line (#3712)
0986868b1b | oobabooga        | 2023-08-27 14:53:42 -07:00 | Fix chat scrolling with Dark Reader extension
b2296dcda0 | oobabooga        | 2023-08-27 13:14:33 -07:00 | Ctrl+S to show/hide chat controls
a965a36803 | Kelvie Wong      | 2023-08-27 12:29:00 -03:00 | Add ffmpeg to the Docker image (#3664)
e4c3e1bdd2 | Ravindra Marella | 2023-08-27 10:53:48 -03:00 | Fix ctransformers model unload (#3711)
                                                             Add missing comma in model types list
                                                             Fixes marella/ctransformers#111
0c9e818bb8 | oobabooga        | 2023-08-26 23:10:45 -07:00 | Update truncation length based on max_seq_len/n_ctx
e6eda5c2da | oobabooga        | 2023-08-27 02:33:26 -03:00 | Merge pull request #3695 from oobabooga/gguf2
                                                             GGUF
3361728da1 | oobabooga        | 2023-08-26 22:24:44 -07:00 | Change some comments
8aeae3b3f4 | oobabooga        | 2023-08-26 22:15:06 -07:00 | Fix llamacpp_HF loading
7f5370a272 | oobabooga        | 2023-08-26 22:11:07 -07:00 | Minor fixes/cosmetics
d826bc5d1b | oobabooga        | 2023-08-27 01:51:00 -03:00 | Merge pull request #3697 from jllllll/llamacpp-ggml
                                                             Use separate llama-cpp-python packages for GGML support
4d61a7d9da | jllllll          | 2023-08-26 14:07:46 -05:00 | Account for deprecated GGML parameters
4a999e3bcd | jllllll          | 2023-08-26 10:40:08 -05:00 | Use separate llama-cpp-python packages for GGML support
6e6431e73f | oobabooga        | 2023-08-26 01:07:28 -07:00 | Update requirements.txt
83640d6f43 | oobabooga        | 2023-08-26 01:06:59 -07:00 | Replace ggml occurences with gguf
1a642c12b5 | oobabooga        | 2023-08-26 00:45:07 -07:00 | Fix silero_tts HTML unescaping
db42b365c9 | jllllll          | 2023-08-25 14:37:02 -03:00 | Fix ctransformers threads auto-detection (#3688)
0bcecaa216 | oobabooga        | 2023-08-25 07:59:23 -07:00 | Set mode: instruct for CodeLlama-instruct
960980247f | cal066           | 2023-08-25 11:33:04 -03:00 | ctransformers: gguf support (#3685)
21058c37f7 | oobabooga        | 2023-08-25 07:10:26 -07:00 | Add missing file
f4f04c8c32 | oobabooga        | 2023-08-25 07:08:38 -07:00 | Fix a typo
5c7d8bfdfd | oobabooga        | 2023-08-25 07:06:57 -07:00 | Detect CodeLlama settings
52ab2a6b9e | oobabooga        | 2023-08-25 06:55:15 -07:00 | Add rope_freq_base parameter for CodeLlama
feecd8190f | oobabooga        | 2023-08-24 21:01:09 -07:00 | Unescape inline code blocks
26c5e5e878 | oobabooga        | 2023-08-24 19:23:08 -07:00 | Bump autogptq
a2c67262c7 | oobabooga        | 2023-08-24 17:27:12 -07:00 | Unescape model output for silero/elevenlabs
3320accfdc | oobabooga        | 2023-08-24 20:32:21 -03:00 | Add CFG to llamacpp_HF (second attempt) (#3678)
d6934bc7bc | oobabooga        | 2023-08-24 16:27:36 -03:00 | Implement CFG for ExLlama_HF (#3666)
2b675533f7 | oobabooga        | 2023-08-23 14:36:03 -07:00 | Un-bump safetensors
                                                             The newest one doesn't work on Windows yet
87442c6d18 | oobabooga        | 2023-08-22 21:00:12 -07:00 | Fix Notebook Logits tab
c0b119c3a3 | oobabooga        | 2023-08-22 20:35:12 -07:00 | Improve logit viewer format
6d6f40e8f8 | oobabooga        | 2023-08-22 20:18:45 -07:00 | Merge remote-tracking branch 'refs/remotes/origin/main'
8545052c9d | oobabooga        | 2023-08-22 20:18:16 -07:00 | Add the option to use samplers in the logit viewer
d7c98fe715 | oobabooga        | 2023-08-22 21:48:32 -03:00 | Update stale.yml
0b352ea7ef | Sam              | 2023-08-22 17:41:11 -03:00 | Add missing extensions to Dockerfile (#3544)
25e5eaa6a6 | oobabooga        | 2023-08-22 13:16:44 -07:00 | Remove outdated training warning
335c49cc7e | oobabooga        | 2023-08-22 13:14:59 -07:00 | Bump peft and transformers
727fd229f4 | oobabooga        | 2023-08-22 13:03:17 -07:00 | Increase stalebot timeout to 6 weeks
df165fe6c4 | tkbit            | 2023-08-22 16:55:17 -03:00 | Use numpy==1.24 in requirements.txt (#3651)
                                                             The whisper extension needs numpy 1.24 to work properly
e042bf8624 | cal066           | 2023-08-22 16:51:34 -03:00 | ctransformers: add mlock and no-mmap options (#3649)
2da38e89e6 | tdrussell        | 2023-08-22 16:49:37 -03:00 | Fix whitespace formatting in perplexity_colors extension. (#3643)
1b419f656f | oobabooga        | 2023-08-21 11:57:51 -07:00 | Acknowledge a16z support
6cca8b8028 | oobabooga        | 2023-08-21 05:39:55 -07:00 | Only update notebook token counter on input
                                                             For performance during streaming
41b98e07fb | oobabooga        | 2023-08-20 22:09:18 -07:00 | Minor CSS fix