Commit Graph

2494 Commits

Author            SHA1        Message  Date
oobabooga         9331ab4798  Read GGUF metadata (#3873)  2023-09-11 18:49:30 -03:00
oobabooga         39f4800d94  Merge remote-tracking branch 'refs/remotes/origin/main'  2023-09-11 10:45:13 -07:00
oobabooga         5c58dfadef  Update requirements_nocuda.txt  2023-09-11 10:44:19 -07:00
Sam               fa363da7ce  improve docker builds (#3715)  2023-09-11 12:22:00 -03:00
oobabooga         df52dab67b  Lint  2023-09-11 07:57:38 -07:00
Eve               92f3cd624c  Improve instructions for CPUs without AVX2 (#3786)  2023-09-11 11:54:04 -03:00
oobabooga         ed86878f02  Remove GGML support  2023-09-11 07:44:00 -07:00
John Smith        cc7b7ba153  fix lora training with alpaca_lora_4bit (#3853)  2023-09-11 01:22:20 -03:00
Forkoz            15e9b8c915  Exllama new rope settings (#3852)  2023-09-11 01:14:36 -03:00
jllllll           859b4fd737  Bump exllama to 0.1.17 (#3847)  2023-09-11 01:13:14 -03:00
dependabot[bot]   1d6b384828  Update transformers requirement from ==4.32.* to ==4.33.* (#3865)  2023-09-11 01:12:22 -03:00
jllllll           e8f234ca8f  Bump llama-cpp-python to 0.1.84 (#3854)  2023-09-11 01:11:33 -03:00
oobabooga         66d5caba1b  Pin pydantic version (closes #3850)  2023-09-10 21:09:04 -07:00
oobabooga         4affa08821  Do not impose instruct mode while loading models  2023-09-02 11:31:33 -07:00
oobabooga         0576691538  Add optimum to requirements (for GPTQ LoRA training)  2023-08-31 08:45:38 -07:00
                              See https://github.com/oobabooga/text-generation-webui/issues/3655
oobabooga         40ffc3d687  Update README.md  2023-08-30 18:19:04 -03:00
oobabooga         47e490c7b4  Set use_cache=True by default for all models  2023-08-30 13:26:27 -07:00
oobabooga         5190e153ed  Update README.md  2023-08-30 14:06:29 -03:00
jllllll           9626f57721  Bump exllama to 0.0.14 (#3758)  2023-08-30 13:43:38 -03:00
oobabooga         bc4023230b  Improved instructions for AMD/Metal/Intel Arc/CPUs without AVX2  2023-08-30 09:40:00 -07:00
oobabooga         b2f7ca0d18  Cloudfare fix 2  2023-08-29 19:54:43 -07:00
missionfloyd      787219267c  Allow downloading single file from UI (#3737)  2023-08-29 23:32:36 -03:00
Alberto Ferrer    f63dd83631  Update download-model.py (Allow single file download) (#3732)  2023-08-29 22:57:58 -03:00
jllllll           dac5f4b912  Bump llama-cpp-python to 0.1.83 (#3745)  2023-08-29 22:35:59 -03:00
oobabooga         6c16e4cecf  Cloudfare fix  2023-08-29 16:35:44 -07:00
                              Credits: https://github.com/oobabooga/text-generation-webui/issues/1524#issuecomment-1698255209
oobabooga         828d97a98c  Minor CSS improvement  2023-08-29 16:15:12 -07:00
oobabooga         a26c2300cb  Make instruct style more readable (attempt)  2023-08-29 14:14:01 -07:00
q5sys (JT)        cdb854db9e  Update llama.cpp.md instructions (#3702)  2023-08-29 17:56:50 -03:00
VishwasKukreti    a9a1784420  Update accelerate to 0.22 in requirements.txt (#3725)  2023-08-29 17:47:37 -03:00
oobabooga         cec8db52e5  Add max_tokens_second param (#3533)  2023-08-29 17:44:31 -03:00
jllllll           fe1f7c6513  Bump ctransformers to 0.2.25 (#3740)  2023-08-29 17:24:36 -03:00
oobabooga         672b610dba  Improve tab switching js  2023-08-29 13:22:15 -07:00
oobabooga         2b58a89f6a  Clear instruction template before loading new one  2023-08-29 13:11:32 -07:00
oobabooga         36864cb3e8  Use Alpaca as the default instruction template  2023-08-29 13:06:25 -07:00
oobabooga         9a202f7fb2  Prevent <ul> lists from flickering during streaming  2023-08-28 20:45:07 -07:00
oobabooga         8b56fc993a  Change lists style in chat mode  2023-08-28 20:14:02 -07:00
oobabooga         e8c0c4990d  Unescape HTML in the chat API examples  2023-08-28 19:42:03 -07:00
oobabooga         439dd0faab  Fix stopping strings in the chat API  2023-08-28 19:40:11 -07:00
oobabooga         86c45b67ca  Merge remote-tracking branch 'refs/remotes/origin/main'  2023-08-28 18:29:38 -07:00
oobabooga         c75f98a6d6  Autoscroll Notebook/Default textareas during streaming  2023-08-28 18:22:03 -07:00
jllllll           22b2a30ec7  Bump llama-cpp-python to 0.1.82 (#3730)  2023-08-28 18:02:24 -03:00
oobabooga         558e918fd6  Add a typing dots (...) animation to chat tab  2023-08-28 13:50:36 -07:00
oobabooga         57e9ded00c  Make it possible to scroll during streaming (#3721)  2023-08-28 16:03:20 -03:00
jllllll           7d3a0b5387  Bump llama-cpp-python to 0.1.81 (#3716)  2023-08-27 22:38:41 -03:00
oobabooga         fdef0e4efa  Focus on chat input field after Ctrl+S  2023-08-27 16:45:37 -07:00
Cebtenzzre        2f5d769a8d  accept floating-point alpha value on the command line (#3712)  2023-08-27 18:54:43 -03:00
oobabooga         0986868b1b  Fix chat scrolling with Dark Reader extension  2023-08-27 14:53:42 -07:00
oobabooga         b2296dcda0  Ctrl+S to show/hide chat controls  2023-08-27 13:14:33 -07:00
Kelvie Wong       a965a36803  Add ffmpeg to the Docker image (#3664)  2023-08-27 12:29:00 -03:00
Ravindra Marella  e4c3e1bdd2  Fix ctransformers model unload (#3711)  2023-08-27 10:53:48 -03:00
                              Add missing comma in model types list
                              Fixes marella/ctransformers#111