oobabooga | 0fee18e8b7 | Rename some functions | 2023-09-22 12:08:05 -07:00
oobabooga | 6c5f81f002 | Rename webui.py to one_click.py | 2023-09-22 12:00:06 -07:00
oobabooga | fe2acdf45f | Update README.md | 2023-09-22 15:52:20 -03:00
oobabooga | 967dda17a0 | Remove OOBABOOGA_FLAGS | 2023-09-22 10:52:52 -07:00
oobabooga | ccfc919016 | Make webui.py more readable | 2023-09-22 10:51:29 -07:00
oobabooga | c74326de02 | Fixes by @jllllll | 2023-09-22 10:37:22 -07:00
oobabooga | b4b5f45558 | Join the installation instructions | 2023-09-22 10:28:22 -07:00
oobabooga | 2d2a8cfb48 | Remove a file | 2023-09-22 10:08:08 -07:00
oobabooga | 3314b7d795 | Allow start scripts to have command-line flags | 2023-09-22 10:03:56 -07:00
oobabooga | d43d150b1e | Fix a bug in the chat API (closes #4034) | 2023-09-22 09:40:07 -07:00
oobabooga | 8ab3eca9ec | Add a warning for outdated installations | 2023-09-22 09:35:19 -07:00
oobabooga | 86648d4085 | Remove CUDA, keep only pytorch | 2023-09-22 08:13:11 -07:00
oobabooga | 66363a4d70 | Minor changes / reorder some functions | 2023-09-22 08:02:21 -07:00
oobabooga | 84b5a519cb | Merge pull request #4029 from jllllll/one-click (Various one-click-installer updates and fixes) | 2023-09-22 11:55:01 -03:00
oobabooga | 02e771403b | Improve the default character | 2023-09-22 07:23:33 -07:00
oobabooga | 95976a9d4f | Fix a bug while deleting characters | 2023-09-22 06:02:34 -07:00
jllllll | 69b0aedd95 | Fix missing models warning | 2023-09-22 01:12:08 -05:00
jllllll | 060bb76aa0 | Update WSL installer | 2023-09-22 01:10:30 -05:00
oobabooga | ee7bf49804 | Change back list style | 2023-09-21 21:09:22 -07:00
jllllll | 9054c98eca | Use --autostash on git pull | 2023-09-21 23:00:33 -05:00
oobabooga | 12e312ae9c | Focus on the chat input always | 2023-09-21 20:32:24 -07:00
jllllll | 498552a92b | More robust installation check for installer | 2023-09-21 22:23:23 -05:00
jllllll | cd1049eded | Add Conda env deactivation to installer scripts (avoids conflicts with existing Conda installations) | 2023-09-21 21:52:29 -05:00
jllllll | 6bbfc40d10 | Add .git creation to installer | 2023-09-21 21:51:58 -05:00
oobabooga | d5330406fa | Add a rename menu for chat histories | 2023-09-21 19:16:51 -07:00
oobabooga | d6814d7c15 | Fix a bug in the API (closes #4027) | 2023-09-21 17:54:53 -07:00
oobabooga | 193fe18c8c | Resolve conflicts | 2023-09-21 17:45:11 -07:00
oobabooga | df39f455ad | Merge remote-tracking branch 'second-repo/main' into merge-second-repo | 2023-09-21 17:39:54 -07:00
oobabooga | fc2b831692 | Basic changes | 2023-09-21 15:55:09 -07:00
oobabooga | b04b3957f9 | Move one-click-installers into the repository | 2023-09-21 15:35:53 -07:00
oobabooga | 05c4a4f83c | Bump exllamav2 | 2023-09-21 14:56:01 -07:00
oobabooga | 9a5ab454b4 | Improve list styles | 2023-09-21 14:49:00 -07:00
oobabooga | 00ab450c13 | Multiple histories for each character (#4022) | 2023-09-21 17:19:32 -03:00
oobabooga | 029da9563f | Avoid redundant function call in llamacpp_hf | 2023-09-19 14:14:40 -07:00
oobabooga | 9b7646140c | Trim model path if using absolute path | 2023-09-19 13:51:57 -07:00
oobabooga | 869f47fff9 | Lint | 2023-09-19 13:51:57 -07:00
oobabooga | 13ac55fa18 | Reorder some functions | 2023-09-19 13:51:57 -07:00
oobabooga | e2fddd9584 | More robust autoscrolling (attempt) | 2023-09-19 13:12:34 -07:00
oobabooga | 03dc69edc5 | ExLlama_HF (v1 and v2) prefix matching | 2023-09-19 13:12:19 -07:00
oobabooga | 5075087461 | Fix command-line arguments being ignored | 2023-09-19 13:11:46 -07:00
oobabooga | ff5d3d2d09 | Add missing import | 2023-09-18 16:26:54 -07:00
oobabooga | 605ec3c9f2 | Add a warning about ExLlamaV2 without flash-attn | 2023-09-18 12:26:35 -07:00
oobabooga | f0ef971edb | Remove obsolete warning | 2023-09-18 12:25:10 -07:00
oobabooga | 745807dc03 | Faster llamacpp_HF prefix matching | 2023-09-18 11:02:45 -07:00
BadisG | 893a72a1c5 | Stop generation immediately when using "Maximum tokens/second" (#3952) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-09-18 14:27:06 -03:00
jllllll | b7c55665c1 | Bump llama-cpp-python to 0.2.6 (#3982) | 2023-09-18 14:08:37 -03:00
Cebtenzzre | 8466cf229a | llama.cpp: fix ban_eos_token (#3987) | 2023-09-18 12:15:02 -03:00
oobabooga | 0ede2965d5 | Remove an error message | 2023-09-17 18:46:08 -07:00
dependabot[bot] | 661bfaac8e | Update accelerate from ==0.22.* to ==0.23.* (#3981) | 2023-09-17 22:42:12 -03:00
Chenxiao Wang | 347aed4254 | extensions/openai: load extension settings via settings.yaml (#3953) | 2023-09-17 22:39:29 -03:00