Commit Graph

1210 Commits

Author SHA1 Message Date
oobabooga
8b442305ac Rename another variable 2023-04-03 01:15:20 -03:00
oobabooga
08448fb637 Rename a variable 2023-04-03 01:02:11 -03:00
oobabooga
2a267011dc Use Path.stem for simplicity 2023-04-03 00:56:14 -03:00
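
The Path.stem change (2a267011dc) swaps manual filename string slicing for pathlib. A minimal sketch of what Path.stem yields; the file path here is made up for illustration:

    from pathlib import Path

    model_file = Path("models/llama-7b/ggml-model-q4_0.bin")
    print(model_file.name)    # ggml-model-q4_0.bin  (file name with extension)
    print(model_file.stem)    # ggml-model-q4_0      (file name without the final suffix)
    print(model_file.suffix)  # .bin
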
Alex "mcmonkey" Goodwin
ea97303509 Apply dialogue format in all character fields not just example dialogue (#650) 2023-04-02 21:54:29 -03:00
oobabooga
525f729b8e Update README.md 2023-04-02 21:12:41 -03:00
oobabooga
53084241b4 Update README.md 2023-04-02 20:50:06 -03:00
TheTerrasque
2157bb4319 New yaml character format (#337 from TheTerrasque/feature/yaml-characters)
This doesn't break backward compatibility with JSON characters.
2023-04-02 20:34:25 -03:00
oobabooga
7ce608d101 Merge pull request #732 from StefanDanielSchwarz/fix-verbose-(beam-search)-preset
Fix "Verbose (Beam Search)" preset
2023-04-02 19:38:11 -03:00
SDS
34c3b4af6e Fix "Verbose (Beam Search)" preset
Just a quick fix that removes an erroneous space between "length_penalty" and "=" (doesn't affect Python, but makes it possible to source the file from Bash, e.g. to use the variables with API calls).
2023-04-03 00:31:58 +02:00
oobabooga
1a823aaeb5 Clear text input for chat (#715 from bmoconno/clear-chat-input) 2023-04-02 18:08:25 -03:00
oobabooga
0dc6fa038b Use gr.State() to store the user input 2023-04-02 18:05:21 -03:00
oobabooga
5f3f3faa96 Better handle CUDA out of memory errors in chat mode 2023-04-02 17:48:00 -03:00
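
"Better handle CUDA out of memory errors in chat mode" (5f3f3faa96) is about catching the allocator failure instead of letting the traceback kill the reply. A minimal sketch of the general pattern, not the project's actual handler; `generate_reply` is a hypothetical stand-in, and torch.cuda.OutOfMemoryError requires PyTorch 1.13 or newer:

    import torch

    def safe_generate(generate_reply, prompt):
        try:
            return generate_reply(prompt)
        except torch.cuda.OutOfMemoryError:
            # Free cached blocks so the next, shorter request has a chance.
            torch.cuda.empty_cache()
            return "CUDA ran out of memory. Try a shorter prompt or a smaller model."
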
Brian O'Connor
d0f9625f0b Clear text input for chat
Add logic to clear the textbox for chat input when the user submits or hits the generate button.
2023-04-01 21:48:24 -04:00
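
Together with "Use gr.State() to store the user input" (0dc6fa038b) above, the idea is to stash the submitted text in a gr.State value and blank the visible textbox before generation starts. A minimal Gradio sketch of that pattern, not the webui's actual layout; the handler names are illustrative:

    import gradio as gr

    def store_and_clear(text):
        # Keep the submitted text in State and clear the visible textbox.
        return text, ""

    def respond(stored_text):
        return f"You said: {stored_text}"

    with gr.Blocks() as demo:
        stored_input = gr.State("")
        textbox = gr.Textbox(label="Chat input")
        reply = gr.Textbox(label="Reply")
        # The first event empties the box; the chained event generates from State.
        textbox.submit(store_and_clear, [textbox], [stored_input, textbox]).then(
            respond, [stored_input], [reply]
        )

    demo.launch()
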
oobabooga
b0890a7925 Add shared.is_chat() function 2023-04-01 20:15:00 -03:00
oobabooga
b38ba230f4 Update download-model.py 2023-04-01 15:03:24 -03:00
oobabooga
b6f817be45 Update README.md 2023-04-01 14:54:10 -03:00
oobabooga
88fa38ac01 Update README.md 2023-04-01 14:49:03 -03:00
oobabooga
526d5725db Update download-model.py 2023-04-01 14:47:47 -03:00
oobabooga
4b57bd0d99 Update README.md 2023-04-01 14:38:04 -03:00
oobabooga
b53bec5a1f Update README.md 2023-04-01 14:37:35 -03:00
oobabooga
9160586c04 Update README.md 2023-04-01 14:31:10 -03:00
oobabooga
7ec11ae000 Update README.md 2023-04-01 14:15:19 -03:00
oobabooga
b857f4655b Update shared.py 2023-04-01 13:56:47 -03:00
oobabooga
012f4f83b8 Update README.md 2023-04-01 13:55:15 -03:00
oobabooga
fcda3f8776 Add also_return_rows to generate_chat_prompt 2023-04-01 01:12:13 -03:00
oobabooga
8c51b405e4 Progress towards generalizing Interface mode tab 2023-03-31 23:41:10 -03:00
oobabooga
23116b88ef Add support for resuming downloads (#654 from nikita-skakun/support-partial-downloads) 2023-03-31 22:55:55 -03:00
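
#654 adds resumable downloads to download-model.py. The core idea is an HTTP Range request that asks the server only for the bytes that are still missing; a minimal sketch under that assumption, with an illustrative function name rather than the script's actual code:

    import os
    import requests

    def download_with_resume(url, path, chunk_size=1024 * 1024):
        start = os.path.getsize(path) if os.path.exists(path) else 0
        headers = {"Range": f"bytes={start}-"} if start else {}
        with requests.get(url, headers=headers, stream=True, timeout=30) as r:
            # 206 Partial Content: the server honoured the Range header and we can
            # append; a plain 200 means we must start over from byte zero.
            mode = "ab" if r.status_code == 206 else "wb"
            with open(path, mode) as f:
                for chunk in r.iter_content(chunk_size=chunk_size):
                    f.write(chunk)
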
oobabooga
74462ac713 Don't override the metadata when checking the sha256sum 2023-03-31 22:52:52 -03:00
oobabooga
2c52310642 Add --threads flag for llama.cpp 2023-03-31 21:18:05 -03:00
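
The --threads flag (2c52310642) exposes how many CPU threads llama.cpp may use. A hedged sketch of how such a flag is typically declared with argparse; the default and help text here are illustrative, not the project's exact definition:

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--threads", type=int, default=0,
                        help="Number of CPU threads for llama.cpp (0 = backend default)")
    args = parser.parse_args(["--threads", "8"])
    print(args.threads)  # 8
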
oobabooga
eeafd60713 Fix streaming 2023-03-31 19:05:38 -03:00
oobabooga
52065ae4cd Add repetition_penalty 2023-03-31 19:01:34 -03:00
oobabooga
2259143fec Fix llama.cpp with --no-stream 2023-03-31 18:43:45 -03:00
oobabooga
875de5d983 Update ggml template 2023-03-31 17:57:31 -03:00
oobabooga
cbfe0b944a Update README.md 2023-03-31 17:49:11 -03:00
oobabooga
6a44f4aec6 Add support for downloading ggml files 2023-03-31 17:33:42 -03:00
oobabooga
3a47a602a3 Detect ggml*.bin files automatically 2023-03-31 17:18:21 -03:00
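
"Detect ggml*.bin files automatically" (3a47a602a3) pairs with the ggml download support above: if a model folder contains a file matching that pattern, it can be routed to the llama.cpp loader. A minimal sketch of the detection, assuming a plain glob over the model directory:

    from pathlib import Path

    def find_ggml_files(model_dir):
        # Any ggml*.bin file in the folder is treated as a llama.cpp model.
        return sorted(Path(model_dir).glob("ggml*.bin"))

    print(find_ggml_files("models/llama-7b-ggml"))
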
oobabooga
0aee7341d8 Properly count tokens/s for llama.cpp in chat mode 2023-03-31 17:04:32 -03:00
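
"Properly count tokens/s for llama.cpp in chat mode" (0aee7341d8) concerns the throughput figure printed after each reply: newly generated tokens divided by wall-clock generation time. A toy illustration of the arithmetic; `generate` is a hypothetical callable returning the generated token ids:

    import time

    def tokens_per_second(generate, prompt):
        t0 = time.time()
        new_tokens = generate(prompt)
        elapsed = time.time() - t0
        return len(new_tokens) / elapsed if elapsed > 0 else 0.0
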
oobabooga
5c4e44b452 llama.cpp documentation 2023-03-31 15:20:39 -03:00
oobabooga
6fd70d0032 Add llama.cpp support (#447 from thomasantony/feature/llamacpp)
Documentation: https://github.com/oobabooga/text-generation-webui/wiki/llama.cpp-models
2023-03-31 15:17:32 -03:00
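
#447 wires llama.cpp models (the ggml*.bin files above) into the webui through Python bindings; the linked wiki page has the project's own instructions. As a rough illustration only, this is what loading and prompting a ggml model looks like with the llama-cpp-python bindings, which are an assumption for this sketch and not necessarily the binding the PR used; the model path is made up:

    from llama_cpp import Llama

    llm = Llama(model_path="models/llama-7b/ggml-model-q4_0.bin", n_threads=8)
    out = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
    print(out["choices"][0]["text"])
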
oobabooga
a5c9b7d977 Bump llamacpp version 2023-03-31 15:08:01 -03:00
oobabooga
ea3ba6fc73 Merge branch 'feature/llamacpp' of github.com:thomasantony/text-generation-webui into thomasantony-feature/llamacpp 2023-03-31 14:45:53 -03:00
oobabooga
09b0a3aafb Add repetition_penalty 2023-03-31 14:45:17 -03:00
oobabooga
4d98623041 Merge branch 'main' into feature/llamacpp 2023-03-31 14:37:04 -03:00
oobabooga
4c27562157 Minor changes 2023-03-31 14:33:46 -03:00
oobabooga
9d1dcf880a General improvements 2023-03-31 14:27:01 -03:00
oobabooga
770ff0efa9 Merge branch 'main' of github.com:oobabooga/text-generation-webui 2023-03-31 12:22:22 -03:00
oobabooga
1d1d9e40cd Add seed to settings 2023-03-31 12:22:07 -03:00
oobabooga
daeab6bac7 Merge pull request #678 from mayaeary/fix/python3.8
Fix `type object is not subscriptable`
2023-03-31 12:19:06 -03:00
oobabooga
5a6f939f05 Change the preset here too 2023-03-31 10:43:05 -03:00
Maya
b246d17513 Fix type object is not subscriptable
Fix `type object is not subscriptable` on python 3.8
2023-03-31 14:20:31 +03:00
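
The error in #678 comes from subscripting built-in container types in annotations (e.g. list[str]), which only works on Python 3.9+; on 3.8 it raises "TypeError: 'type' object is not subscriptable". The fix is to use the typing module's generic aliases instead. A minimal illustration with a hypothetical function, since only the annotation style matters:

    from typing import Dict, List

    # Works on Python 3.8 and newer:
    def count_words(lines: List[str]) -> Dict[str, int]:
        counts: Dict[str, int] = {}
        for line in lines:
            for word in line.split():
                counts[word] = counts.get(word, 0) + 1
        return counts

    # The 3.9+-only spelling that fails on 3.8 would be:
    #   def count_words(lines: list[str]) -> dict[str, int]: ...
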