Author | Commit | Message | Date
oobabooga | a060908d6c | Mixtral Instruct: detect prompt format for llama.cpp loader (Workaround until the tokenizer.chat_template kv field gets implemented) | 2023-12-15 06:59:15 -08:00
oobabooga | 3bbf6c601d | AutoGPTQ: Add --disable_exllamav2 flag (Mixtral CPU offloading needs this) | 2023-12-15 06:46:13 -08:00
oobabooga | 7de10f4c8e | Bump AutoGPTQ to 0.6.0 (adds Mixtral support) | 2023-12-15 06:18:49 -08:00
oobabooga | d0677caf2c | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-12-15 04:51:41 -08:00
oobabooga | 69ba3cb0d9 | Bump openai-whisper requirement (closes #4848) | 2023-12-15 04:48:04 -08:00
Song Fuchang | 127c71a22a | Update IPEX to 2.1.10+xpu (#4931) (This will require Intel oneAPI Toolkit 2024.0) | 2023-12-15 03:19:01 -03:00
oobabooga | 85816898f9 | Bump llama-cpp-python to 0.2.23 (including Linux ROCm and MacOS >= 12) (#4930) | 2023-12-15 01:58:08 -03:00
oobabooga | 2cb5b68ad9 | Bug fix: when generation fails, save the sent message (#4915) | 2023-12-15 01:01:45 -03:00
Felipe Ferreira | 11f082e417 | [OpenAI Extension] Add more types to Embeddings Endpoint (#4895) | 2023-12-15 00:26:16 -03:00
Kim Jaewon | e53f99faa0 | [OpenAI Extension] Add 'max_logits' parameter in logits endpoint (#4916) | 2023-12-15 00:22:43 -03:00
oobabooga | eaa1fe67f3 | Remove elevenlabs extension (#4928) | 2023-12-15 00:00:07 -03:00
oobabooga | c3e0fcfc52 | Merge pull request #4927 from oobabooga/dev (Merge dev branch) | 2023-12-14 22:39:08 -03:00
oobabooga | f336f8a811 | Merge branch 'main' into dev | 2023-12-14 17:38:16 -08:00
oobabooga | dde7921057 | One-click installer: minor message change | 2023-12-14 17:27:32 -08:00
oobabooga | fd1449de20 | One-click installer: fix minor bug introduced in previous commit | 2023-12-14 16:52:44 -08:00
oobabooga | 4ae2dcebf5 | One-click installer: more friendly progress messages | 2023-12-14 16:48:00 -08:00
oobabooga | 8acecf3aee | Bump llama-cpp-python to 0.2.23 (NVIDIA & CPU-only, no AMD, no Metal) (#4924) | 2023-12-14 09:41:36 -08:00
oobabooga | 8835ea3704 | Bump llama-cpp-python to 0.2.23 (NVIDIA & CPU-only, no AMD, no Metal) (#4924) | 2023-12-14 14:39:43 -03:00
oobabooga | bf68d4499e | Merge pull request #4923 from oobabooga/dev (Merge dev branch) | 2023-12-14 13:01:05 -03:00
oobabooga | 623c92792a | Update README | 2023-12-14 07:56:48 -08:00
oobabooga | 3580bed041 | Update README | 2023-12-14 07:54:20 -08:00
oobabooga | e91c09b8af | Merge pull request #4920 from oobabooga/dev (Merge dev branch) | 2023-12-14 11:24:00 -03:00
oobabooga | d5ec3c3444 | Update README | 2023-12-14 06:20:52 -08:00
oobabooga | 5b283fff22 | Update README | 2023-12-14 06:15:14 -08:00
oobabooga | 958799221f | Update README | 2023-12-14 06:09:03 -08:00
oobabooga | e7fa17740a | Update README | 2023-12-13 22:49:42 -08:00
oobabooga | 03babe7d81 | Update README | 2023-12-13 22:47:08 -08:00
oobabooga | aad14174e4 | Update README | 2023-12-13 22:46:18 -08:00
oobabooga | 783947a2aa | Update README | 2023-12-13 22:44:25 -08:00
oobabooga | 7fef16950f | Update README | 2023-12-13 22:42:54 -08:00
oobabooga | d36e7f1762 | Update README | 2023-12-13 22:35:22 -08:00
oobabooga | 9695db0ee4 | Update README | 2023-12-13 22:30:31 -08:00
oobabooga | d354f5009c | Update README | 2023-12-13 22:21:29 -08:00
oobabooga | 0a4fad2d46 | Update README | 2023-12-13 22:20:37 -08:00
oobabooga | fade6abfe9 | Update README | 2023-12-13 22:18:40 -08:00
oobabooga | aafd15109d | Update README | 2023-12-13 22:15:58 -08:00
oobabooga | 634518a412 | Update README | 2023-12-13 22:08:41 -08:00
oobabooga | 0d5ca05ab9 | Update README | 2023-12-13 22:06:04 -08:00
oobabooga | d241de86c4 | Update README | 2023-12-13 22:02:26 -08:00
Lounger | 5754f0c357 | Fix deleting chat logs (#4914) | 2023-12-13 21:54:43 -03:00
Bartowski | f51156705d | Allow symlinked folder within root directory (#4863) | 2023-12-13 18:08:21 -03:00
oobabooga | 36e850fe89 | Update README.md | 2023-12-13 17:55:41 -03:00
oobabooga | 3e0c11a758 | Merge pull request #4912 from oobabooga/dev (Merge dev branch) | 2023-12-13 15:49:36 -03:00
oobabooga | 1bfee1d12e | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-12-13 10:48:34 -08:00
oobabooga | d14d4cad4a | Lint | 2023-12-13 10:48:15 -08:00
Ixion | 3f3960dbfb | Fixed invalid Jinja2 syntax in instruction templates (#4911) | 2023-12-13 15:46:23 -03:00
oobabooga | 4eeac70af7 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-12-13 10:40:41 -08:00
oobabooga | fcf5512364 | Jinja templates: fix a potential small bug | 2023-12-13 10:19:39 -08:00
missionfloyd | bdcc769e6f | Bypass coqui TTS EULA check (#4905) | 2023-12-13 02:26:46 -03:00
oobabooga | 7f1a6a70e3 | Update the llamacpp_HF comment | 2023-12-12 21:04:20 -08:00