Commit Graph

3242 Commits

Author SHA1 Message Date
oobabooga
427a165597 Bump TTS version in coqui_tts 2023-12-04 07:21:56 -08:00
Song Fuchang
0bfd5090be Import accelerate very early to make Intel GPU happy (#4704) 2023-12-03 22:51:18 -03:00
dependabot[bot]
2e83844f35 Bump safetensors from 0.4.0 to 0.4.1 (#4750) 2023-12-03 22:50:10 -03:00
Ikko Eltociear Ashimine
06cc9a85f7 README: minor typo fix (#4793) 2023-12-03 22:46:34 -03:00
Lounger
7c0a17962d Gallery improvements (#4789) 2023-12-03 22:45:50 -03:00
oobabooga
96df4f10b9 Merge pull request #4777 from oobabooga/dev
Merge dev branch
2023-12-01 00:00:17 -03:00
oobabooga
77d6ccf12b Add a LOADER debug message while loading models 2023-11-30 12:00:32 -08:00
oobabooga
1c90e02243 Update Colab-TextGen-GPU.ipynb 2023-11-30 11:55:18 -08:00
oobabooga
092a2c3516 Fix a bug in llama.cpp get_logits() function 2023-11-30 11:21:40 -08:00
oobabooga
6d3a9b8689 Merge pull request #4773 from oobabooga/dev
Merge dev branch
2023-11-30 02:31:37 -03:00
oobabooga
000b77a17d Minor docker changes 2023-11-29 21:27:23 -08:00
Callum
88620c6b39
feature/docker_improvements (#4768) 2023-11-30 02:20:23 -03:00
oobabooga
2698d7c9fd Fix llama.cpp model unloading 2023-11-29 15:19:48 -08:00
oobabooga
fa89d305e3 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2023-11-29 15:13:17 -08:00
oobabooga
9940ed9c77 Sort the loaders 2023-11-29 15:13:03 -08:00
Manu Kashyap
78fd7f6aa8 Fixed naming for sentence-transformers library (#4764) 2023-11-29 12:15:03 -03:00
oobabooga
a7670c31ca Sort 2023-11-28 18:43:33 -08:00
oobabooga
6e51bae2e0 Sort the loaders menu 2023-11-28 18:41:11 -08:00
oobabooga
f4b956b47c Detect yi instruction template 2023-11-27 10:45:47 -08:00
oobabooga
68059d7c23 llama.cpp: minor log change & lint 2023-11-27 10:44:55 -08:00
Denis Iskandarov
1b05832f9a Add direnv artifacts to gitignore (#4737) 2023-11-27 15:43:42 -03:00
xr4dsh
b5b3d18773 reasonable cli args for docker container (#4727) 2023-11-27 15:43:01 -03:00
tsukanov-as
9f7ae6bb2e fix detection of stopping strings when HTML escaping is used (#4728) 2023-11-27 15:42:08 -03:00
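A minimal sketch of the idea behind the fix above, assuming the problem is that stopping strings are compared against HTML-escaped output; the helper name is hypothetical and this is not the PR's actual code:

```python
import html

def contains_stop_string(generated_text: str, stop_strings: list[str]) -> bool:
    # If the visible text has been HTML-escaped (e.g. "<" became "&lt;"),
    # a stopping string such as "</s>" would never match the escaped form.
    # Unescape before comparing so detection still works.
    text = html.unescape(generated_text)
    return any(stop in text for stop in stop_strings)
```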
Eve
d06ce7b75c add openhermes mistral support (#4730) 2023-11-27 15:41:06 -03:00
oobabooga
b6d16a35b1 Minor API fix 2023-11-21 17:56:28 -08:00
oobabooga
51add248c8 Merge pull request #4702 from oobabooga/dev
Merge dev branch
2023-11-21 21:18:27 -03:00
oobabooga
cb0dbffccc Merge branch 'main' into dev 2023-11-21 16:12:45 -08:00
oobabooga
8d811a4d58 one-click: move on instead of crashing if extension fails to install 2023-11-21 16:09:44 -08:00
oobabooga
0589ff5b12 Bump llama-cpp-python to 0.2.19 & add min_p and typical_p parameters to llama.cpp loader (#4701) 2023-11-21 20:59:39 -03:00
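For reference, a minimal sketch of passing these samplers through llama-cpp-python, assuming the 0.2.19+ completion API exposes min_p and typical_p; the model path and parameter values are placeholders, not the loader's actual code:

```python
from llama_cpp import Llama

# Placeholder GGUF path; assumes llama-cpp-python >= 0.2.19.
llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf")

out = llm(
    "Write one sentence about llamas.",
    max_tokens=64,
    temperature=0.7,
    min_p=0.05,     # drop tokens below 5% of the top token's probability
    typical_p=1.0,  # 1.0 leaves typical sampling disabled
)
print(out["choices"][0]["text"])
```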
oobabooga
2769a1fa25 Hide deprecated args from Session tab 2023-11-21 15:15:16 -08:00
oobabooga
0047d9f5e0 Do not install coqui_tts requirements by default
It breaks the one-click installer on Windows.
2023-11-21 15:13:42 -08:00
oobabooga
fb124ab6e2 Bump to flash-attention 2.3.4 + switch to Github Actions wheels on Windows (#4700) 2023-11-21 15:07:17 -08:00
oobabooga
e9cdaa2ada Bump to flash-attention 2.3.4 + switch to Github Actions wheels on Windows (#4700) 2023-11-21 20:06:56 -03:00
oobabooga
b81d6ad8a4 Detect Orca 2 template (#4697) 2023-11-21 15:26:42 -03:00
oobabooga
360eeb9ff1 Merge pull request #4686 from oobabooga/dev
Merge dev branch
2023-11-21 08:38:50 -03:00
oobabooga
54a4eb60a3 Remove --no-dependencies from TTS installation command 2023-11-21 08:30:50 -03:00
oobabooga
efdd99623c Merge pull request #4683 from oobabooga/dev
Merge dev branch
2023-11-21 00:36:58 -03:00
oobabooga
b02dc4dc0d Add --no-dependencies to TTS installation command 2023-11-20 19:02:12 -08:00
oobabooga
55f2a3643b Update multimodal API example 2023-11-20 18:41:09 -08:00
oobabooga
829c6d4f78 Add "remove_trailing_dots" option to XTTSv2 2023-11-20 18:33:29 -08:00
kanttouchthis
8dc9ec3491 add XTTSv2 (coqui_tts extension) (#4673)
---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-11-20 22:37:52 -03:00
oobabooga
ff24648510 Credit llama-cpp-python in the README 2023-11-20 12:13:15 -08:00
oobabooga
be78d79811 Revert accidental noavx2 changes 2023-11-20 11:48:04 -08:00
oobabooga
4b84e45116 Use +cpuavx2 instead of +cpuavx 2023-11-20 11:46:38 -08:00
oobabooga
d7f1bc102b Fix "Illegal instruction" bug in llama.cpp CPU only version (#4677) 2023-11-20 16:36:38 -03:00
drew9781
5e70263e25 docker: install xformers with specific cuda version, matching the docker image. (#4670) 2023-11-19 21:43:15 -03:00
oobabooga
f11092ac2a Merge pull request #4664 from oobabooga/dev
Merge dev branch
2023-11-19 15:12:55 -03:00
oobabooga
f0d66cf817 Add missing file 2023-11-19 10:12:13 -08:00
oobabooga
22e7a22d1e Merge pull request #4662 from oobabooga/dev
Merge dev branch
2023-11-19 14:23:19 -03:00
oobabooga
a2e6d00128 Use convert_ids_to_tokens instead of decode in logits endpoint
This preserves the llama tokenizer spaces.
2023-11-19 09:22:08 -08:00
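The commit above swaps decode() for convert_ids_to_tokens() because decoding ids one at a time detokenizes them and drops the SentencePiece leading-space marker. A minimal illustration with a Hugging Face tokenizer; the model name is only an example, not the endpoint's code:

```python
from transformers import AutoTokenizer

# Any SentencePiece-based (llama-style) tokenizer shows the difference;
# the model name here is just an example.
tokenizer = AutoTokenizer.from_pretrained("openlm-research/open_llama_3b")
ids = tokenizer.encode("Hello world", add_special_tokens=False)

# Decoding ids individually detokenizes them, losing the leading-space info:
print([tokenizer.decode([i]) for i in ids])      # e.g. ['Hello', 'world']

# convert_ids_to_tokens() returns the raw pieces, keeping the "▁" space marker:
print(tokenizer.convert_ids_to_tokens(ids))      # e.g. ['▁Hello', '▁world']
```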