oobabooga | 62d59a516f | Add trust_remote_code to all HF loaders | 2023-12-08 06:29:26 -08:00
oobabooga | 181743fd97 | Fix missing spaces tokenizer issue (closes #4834) | 2023-12-08 05:16:46 -08:00
oobabooga | 00aedf9209 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-12-08 05:02:25 -08:00
oobabooga | 7bbe7e803a | Minor fix | 2023-12-08 05:01:25 -08:00
Yiximail | 1c74b3ab45 | Fix partial unicode characters issue (#4837) | 2023-12-08 09:50:53 -03:00
oobabooga | 2c5a1e67f9 | Parameters: change max_new_tokens & repetition_penalty_range defaults (#4842) | 2023-12-07 20:04:52 -03:00
Song Fuchang | e16e5997ef | Update IPEX install URL. (#4825) | 2023-12-06 21:07:01 -03:00
    Old pip url no longer works. Use the latest url from
    https://intel.github.io/intel-extension-for-pytorch/index.html#installation
oobabooga | d516815c9c | Model downloader: download only fp16 if both fp16 and GGUF are present | 2023-12-05 21:09:12 -08:00
oobabooga | 98361af4d5 | Add QuIP# support (#4803) | 2023-12-06 00:01:01 -03:00
    It has to be installed manually for now.
oobabooga | 6430acadde | Minor bug fix after https://github.com/oobabooga/text-generation-webui/pull/4814 | 2023-12-05 10:08:11 -08:00
oobabooga | c21a9668a5 | Lint | 2023-12-04 21:17:05 -08:00
erew123 | f786aa3caa | Clean-up Ctrl+C Shutdown (#4802) | 2023-12-05 02:16:16 -03:00
oobabooga | 0f828ea441 | Do not limit API updates/second | 2023-12-04 20:45:43 -08:00
oobabooga | 9edb193def | Optimize HF text generation (#4814) | 2023-12-05 00:00:40 -03:00
俞航 | ac9f154bcc | Bump exllamav2 from 0.0.8 to 0.0.10 & Fix code change (#4782) | 2023-12-04 21:15:05 -03:00
oobabooga | 131a5212ce | UI: update context upper limit to 200000 | 2023-12-04 15:48:34 -08:00
oobabooga | f7145544f9 | Update README | 2023-12-04 15:44:44 -08:00
oobabooga | 8e1f86a866 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-12-04 15:41:56 -08:00
oobabooga | be88b072e9 | Update --loader flag description | 2023-12-04 15:41:25 -08:00
dependabot[bot] | 801ba87c68 | Update accelerate requirement from ==0.24.* to ==0.25.* (#4810) | 2023-12-04 20:36:01 -03:00
oobabooga | 7fc9033b2e | Recommend ExLlama_HF and ExLlamav2_HF | 2023-12-04 15:28:46 -08:00
oobabooga | 3f993280e4 | Minor changes | 2023-12-04 07:27:44 -08:00
oobabooga | 0931ed501b | Minor changes | 2023-12-04 07:25:18 -08:00
oobabooga | 427a165597 | Bump TTS version in coqui_tts | 2023-12-04 07:21:56 -08:00
Song Fuchang | 0bfd5090be | Import accelerate very early to make Intel GPU happy (#4704) | 2023-12-03 22:51:18 -03:00
dependabot[bot] | 2e83844f35 | Bump safetensors from 0.4.0 to 0.4.1 (#4750) | 2023-12-03 22:50:10 -03:00
Ikko Eltociear Ashimine | 06cc9a85f7 | README: minor typo fix (#4793) | 2023-12-03 22:46:34 -03:00
Lounger | 7c0a17962d | Gallery improvements (#4789) | 2023-12-03 22:45:50 -03:00
oobabooga | 77d6ccf12b | Add a LOADER debug message while loading models | 2023-11-30 12:00:32 -08:00
oobabooga | 1c90e02243 | Update Colab-TextGen-GPU.ipynb | 2023-11-30 11:55:18 -08:00
oobabooga | 092a2c3516 | Fix a bug in llama.cpp get_logits() function | 2023-11-30 11:21:40 -08:00
oobabooga | 000b77a17d | Minor docker changes | 2023-11-29 21:27:23 -08:00
Callum | 88620c6b39 | feature/docker_improvements (#4768) | 2023-11-30 02:20:23 -03:00
oobabooga | 2698d7c9fd | Fix llama.cpp model unloading | 2023-11-29 15:19:48 -08:00
oobabooga | fa89d305e3 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-11-29 15:13:17 -08:00
oobabooga | 9940ed9c77 | Sort the loaders | 2023-11-29 15:13:03 -08:00
Manu Kashyap | 78fd7f6aa8 | Fixed naming for sentence-transformers library (#4764) | 2023-11-29 12:15:03 -03:00
oobabooga | a7670c31ca | Sort | 2023-11-28 18:43:33 -08:00
oobabooga | 6e51bae2e0 | Sort the loaders menu | 2023-11-28 18:41:11 -08:00
oobabooga | f4b956b47c | Detect yi instruction template | 2023-11-27 10:45:47 -08:00
oobabooga | 68059d7c23 | llama.cpp: minor log change & lint | 2023-11-27 10:44:55 -08:00
Denis Iskandarov | 1b05832f9a | Add direnv artifacts to gitignore (#4737) | 2023-11-27 15:43:42 -03:00
xr4dsh | b5b3d18773 | resonable cli args for docker container (#4727) | 2023-11-27 15:43:01 -03:00
tsukanov-as | 9f7ae6bb2e | fix detection of stopping strings when HTML escaping is used (#4728) | 2023-11-27 15:42:08 -03:00
Eve | d06ce7b75c | add openhermes mistral support (#4730) | 2023-11-27 15:41:06 -03:00
oobabooga | b6d16a35b1 | Minor API fix | 2023-11-21 17:56:28 -08:00
oobabooga | cb0dbffccc | Merge branch 'main' into dev | 2023-11-21 16:12:45 -08:00
oobabooga | 8d811a4d58 | one-click: move on instead of crashing if extension fails to install | 2023-11-21 16:09:44 -08:00
oobabooga | 0589ff5b12 | Bump llama-cpp-python to 0.2.19 & add min_p and typical_p parameters to llama.cpp loader (#4701) | 2023-11-21 20:59:39 -03:00
oobabooga | 2769a1fa25 | Hide deprecated args from Session tab | 2023-11-21 15:15:16 -08:00