missionfloyd | 43ec9d1619 | silero_tts: Add language option (#3878) | 2023-09-12 15:49:46 -03:00
oobabooga | 04a74b3774 | Update README | 2023-09-12 10:46:27 -07:00
oobabooga | 16e1696071 | Minor qol change | 2023-09-12 10:44:26 -07:00
oobabooga | c2a309f56e | Add ExLlamaV2 and ExLlamav2_HF loaders (#3881) | 2023-09-12 14:33:07 -03:00
oobabooga | a821928877 | Reduce chat width | 2023-09-12 10:26:43 -07:00
oobabooga | df123a20fc | Prevent extra keys from being saved to settings.yaml | 2023-09-11 20:13:10 -07:00
oobabooga | dae428a967 | Revamp cai-chat theme, make it default | 2023-09-11 19:30:40 -07:00
oobabooga | 47d1ca467b | Pin pandas version in superbooga | 2023-09-11 18:34:34 -07:00
oobabooga | 78811dd89a | Fix GGUF metadata reading for falcon | 2023-09-11 15:49:50 -07:00
oobabooga | 9331ab4798 | Read GGUF metadata (#3873) | 2023-09-11 18:49:30 -03:00
oobabooga | 39f4800d94 | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-09-11 10:45:13 -07:00
oobabooga | 5c58dfadef | Update requirements_nocuda.txt | 2023-09-11 10:44:19 -07:00
Sam | fa363da7ce | improve docker builds (#3715) | 2023-09-11 12:22:00 -03:00
oobabooga | df52dab67b | Lint | 2023-09-11 07:57:38 -07:00
Eve | 92f3cd624c | Improve instructions for CPUs without AVX2 (#3786) | 2023-09-11 11:54:04 -03:00
oobabooga | ed86878f02 | Remove GGML support | 2023-09-11 07:44:00 -07:00
John Smith | cc7b7ba153 | fix lora training with alpaca_lora_4bit (#3853) | 2023-09-11 01:22:20 -03:00
Forkoz | 15e9b8c915 | Exllama new rope settings (#3852) | 2023-09-11 01:14:36 -03:00
jllllll | 859b4fd737 | Bump exllama to 0.1.17 (#3847) | 2023-09-11 01:13:14 -03:00
dependabot[bot] | 1d6b384828 | Update transformers requirement from ==4.32.* to ==4.33.* (#3865) | 2023-09-11 01:12:22 -03:00
jllllll | e8f234ca8f | Bump llama-cpp-python to 0.1.84 (#3854) | 2023-09-11 01:11:33 -03:00
oobabooga | 66d5caba1b | Pin pydantic version (closes #3850) | 2023-09-10 21:09:04 -07:00
oobabooga | 4affa08821 | Do not impose instruct mode while loading models | 2023-09-02 11:31:33 -07:00
oobabooga | 0576691538 | Add optimum to requirements (for GPTQ LoRA training) | 2023-08-31 08:45:38 -07:00
    See https://github.com/oobabooga/text-generation-webui/issues/3655
oobabooga | 40ffc3d687 | Update README.md | 2023-08-30 18:19:04 -03:00
oobabooga | 47e490c7b4 | Set use_cache=True by default for all models | 2023-08-30 13:26:27 -07:00
oobabooga | 5190e153ed | Update README.md | 2023-08-30 14:06:29 -03:00
jllllll | 9626f57721 | Bump exllama to 0.0.14 (#3758) | 2023-08-30 13:43:38 -03:00
oobabooga | bc4023230b | Improved instructions for AMD/Metal/Intel Arc/CPUs without AVX2 | 2023-08-30 09:40:00 -07:00
oobabooga | b2f7ca0d18 | Cloudfare fix 2 | 2023-08-29 19:54:43 -07:00
missionfloyd | 787219267c | Allow downloading single file from UI (#3737) | 2023-08-29 23:32:36 -03:00
Alberto Ferrer | f63dd83631 | Update download-model.py (Allow single file download) (#3732) | 2023-08-29 22:57:58 -03:00
jllllll | dac5f4b912 | Bump llama-cpp-python to 0.1.83 (#3745) | 2023-08-29 22:35:59 -03:00
oobabooga | 6c16e4cecf | Cloudfare fix | 2023-08-29 16:35:44 -07:00
    Credits: https://github.com/oobabooga/text-generation-webui/issues/1524#issuecomment-1698255209
oobabooga | 828d97a98c | Minor CSS improvement | 2023-08-29 16:15:12 -07:00
oobabooga | a26c2300cb | Make instruct style more readable (attempt) | 2023-08-29 14:14:01 -07:00
q5sys (JT) | cdb854db9e | Update llama.cpp.md instructions (#3702) | 2023-08-29 17:56:50 -03:00
VishwasKukreti | a9a1784420 | Update accelerate to 0.22 in requirements.txt (#3725) | 2023-08-29 17:47:37 -03:00
oobabooga | cec8db52e5 | Add max_tokens_second param (#3533) | 2023-08-29 17:44:31 -03:00
jllllll | fe1f7c6513 | Bump ctransformers to 0.2.25 (#3740) | 2023-08-29 17:24:36 -03:00
oobabooga | 672b610dba | Improve tab switching js | 2023-08-29 13:22:15 -07:00
oobabooga | 2b58a89f6a | Clear instruction template before loading new one | 2023-08-29 13:11:32 -07:00
oobabooga | 36864cb3e8 | Use Alpaca as the default instruction template | 2023-08-29 13:06:25 -07:00
oobabooga | 9a202f7fb2 | Prevent <ul> lists from flickering during streaming | 2023-08-28 20:45:07 -07:00
oobabooga | 8b56fc993a | Change lists style in chat mode | 2023-08-28 20:14:02 -07:00
oobabooga | e8c0c4990d | Unescape HTML in the chat API examples | 2023-08-28 19:42:03 -07:00
oobabooga | 439dd0faab | Fix stopping strings in the chat API | 2023-08-28 19:40:11 -07:00
oobabooga | 86c45b67ca | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-08-28 18:29:38 -07:00
oobabooga | c75f98a6d6 | Autoscroll Notebook/Default textareas during streaming | 2023-08-28 18:22:03 -07:00
jllllll | 22b2a30ec7 | Bump llama-cpp-python to 0.1.82 (#3730) | 2023-08-28 18:02:24 -03:00