oobabooga
b74bf5638b
Install extension dependencies before webui dependencies
This way, the webui requirements take precedence over those of the extensions.
2023-08-14 09:15:25 -07:00
oobabooga
d8a82d34ed
Improve a warning
2023-08-14 08:46:05 -07:00
oobabooga
3e0a9f9cdb
Refresh the character dropdown when saving/deleting a character
2023-08-14 08:23:41 -07:00
oobabooga
890b4abdad
Fix session saving
2023-08-14 07:55:52 -07:00
oobabooga
619cb4e78b
Add "save defaults to settings.yaml" button ( #3574 )
2023-08-14 11:46:07 -03:00
oobabooga
a95e6f02cb
Add a placeholder for custom stopping strings
2023-08-13 21:17:20 -07:00
oobabooga
ff9b5861c8
Fix impersonate when some text is present (closes #3564)
2023-08-13 21:10:47 -07:00
oobabooga
cc7e6ef645
Fix a CSS conflict
2023-08-13 19:24:09 -07:00
Eve
66c04c304d
Various ctransformers fixes (#3556)
Co-authored-by: cal066 <cal066@users.noreply.github.com>
2023-08-13 23:09:03 -03:00
oobabooga
b8df4a436e
Scroll up when switching tabs
2023-08-13 18:48:15 -07:00
oobabooga
c269214219
CSS change to make buttons smaller
2023-08-13 18:45:13 -07:00
oobabooga
4a05aa92cb
Add "send to" buttons for instruction templates
- Remove instruction templates from prompt dropdowns (default/notebook)
- Add 3 buttons to Parameters > Instruction template as a replacement
- Increase the 'negative prompt' field to 3 lines and add a scrollbar
- When uploading a character, switch to the Character tab
- When uploading chat history, switch to the Chat tab
2023-08-13 18:35:45 -07:00
oobabooga
3ae2cee446
Fix empty space when the gallery is hidden
2023-08-13 06:09:27 -07:00
oobabooga
f6db2c78d1
Fix ctransformers seed
2023-08-13 05:48:53 -07:00
oobabooga
919a3cf9d0
Fix the gallery
2023-08-13 05:43:09 -07:00
oobabooga
689f264979
Fix permission
2023-08-12 21:14:37 -07:00
oobabooga
f7ad634634
Remove --chat flag
2023-08-12 21:13:50 -07:00
oobabooga
a1a9ec895d
Unify the 3 interface modes (#3554)
2023-08-13 01:12:15 -03:00
cal066
bf70c19603
ctransformers: move thread and seed parameters (#3543)
2023-08-13 00:04:03 -03:00
jllllll
73421b1fed
Bump ctransformers wheel version (#3558)
2023-08-12 23:02:47 -03:00
Chris Lefever
0230fa4e9c
Add the --disable_exllama option for AutoGPTQ
2023-08-12 02:26:58 -04:00
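For illustration only, a command line exercising the new flag might look like the line below; the model name is a placeholder and selecting AutoGPTQ via --loader is an assumption about the webui's loader options, with only --disable_exllama itself coming from this commit:

    python server.py --model TheBloke_Llama-2-7B-GPTQ --loader autogptq --disable_exllama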
oobabooga
0e05818266
Style changes
2023-08-11 16:35:57 -07:00
oobabooga
4c450e6b70
Update README.md
2023-08-11 15:50:16 -03:00
oobabooga
2f918ccf7c
Remove unused parameter
2023-08-11 11:15:22 -07:00
oobabooga
28c8df337b
Add repetition_penalty_range to ctransformers
2023-08-11 11:04:19 -07:00
cal066
7a4fcee069
Add ctransformers support (#3313)
Co-authored-by: cal066 <cal066@users.noreply.github.com>
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
Co-authored-by: randoentity <137087500+randoentity@users.noreply.github.com>
2023-08-11 14:41:33 -03:00
oobabooga
8dbaa20ca8
Don't replace last reply with an empty message
2023-08-10 13:14:48 -07:00
oobabooga
949c92d7df
Create README.md
2023-08-10 14:32:40 -03:00
oobabooga
0789554f65
Allow --lora to use an absolute path
2023-08-10 10:03:12 -07:00
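A sketch of what this enables; the model name and LoRA path below are placeholders, not taken from the commit:

    python server.py --model llama-2-7b --lora /home/user/loras/my-adapter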
oobabooga
3929971b66
Don't show oobabooga_llama-tokenizer in the model dropdown
2023-08-10 10:02:48 -07:00
Gennadij
e12a1852d9
Add Vicuna-v1.5 detection (#3524)
2023-08-10 13:42:24 -03:00
jllllll
28e3ce4317
Simplify GPTQ-for-LLaMa installation (#122)
2023-08-10 13:19:47 -03:00
oobabooga
e3d2ddd170
Streamline GPTQ-for-LLaMa support (#3526 from jllllll/gptqllama)
2023-08-10 12:54:59 -03:00
oobabooga
c7f52bbdc1
Revert "Remove GPTQ-for-LLaMa monkey patch support"
This reverts commit e3d3565b2a.
2023-08-10 08:39:41 -07:00
oobabooga
16e2b117b4
Minor doc change
2023-08-10 08:38:10 -07:00
jllllll
d6765bebc4
Update installation documentation
2023-08-10 00:53:48 -05:00
jllllll
d7ee4c2386
Remove unused import
2023-08-10 00:10:14 -05:00
jllllll
e3d3565b2a
Remove GPTQ-for-LLaMa monkey patch support
AutoGPTQ will be the preferred GPTQ LoRA loader in the future.
2023-08-09 23:59:04 -05:00
jllllll
bee73cedbd
Streamline GPTQ-for-LLaMa support
2023-08-09 23:42:34 -05:00
oobabooga
a3295dd666
Detect n_gqa and prompt template for wizardlm-70b
2023-08-09 10:51:16 -07:00
oobabooga
a4e48cbdb6
Bump AutoGPTQ
2023-08-09 08:31:17 -07:00
oobabooga
7c1300fab5
Pin aiofiles version to fix statvfs issue
2023-08-09 08:07:55 -07:00
oobabooga
6c6a52aaad
Change the filenames for caches and histories
2023-08-09 07:47:19 -07:00
oobabooga
2255349f19
Update README
2023-08-09 05:46:25 -07:00
GiganticPrime
5bfcfcfc5a
Add the logic for the starchat model series (#3185)
2023-08-09 09:26:12 -03:00
oobabooga
fa4a948b38
Allow users to write one flag per line in CMD_FLAGS.txt
2023-08-09 01:58:23 -03:00
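With this change, CMD_FLAGS.txt can hold one flag per line instead of a single space-separated line; the flags below are just examples of valid webui flags:

    --listen
    --api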
oobabooga
d8fb506aff
Add RoPE scaling support for transformers (including dynamic NTK)
https://github.com/huggingface/transformers/pull/24653
2023-08-08 21:25:48 -07:00
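The linked transformers PR adds the rope_scaling config entry that this commit exposes; a minimal sketch of the upstream API (not the webui's own flags), with a placeholder model name and an arbitrary scaling factor:

    from transformers import AutoConfig, AutoModelForCausalLM

    model_id = "meta-llama/Llama-2-7b-hf"  # placeholder model
    config = AutoConfig.from_pretrained(model_id)
    # "dynamic" selects dynamic NTK scaling; "linear" is the other supported type
    config.rope_scaling = {"type": "dynamic", "factor": 2.0}
    model = AutoModelForCausalLM.from_pretrained(model_id, config=config)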
Hans Raaf
f4caaf337a
Fix superbooga when using regenerate (#3362)
2023-08-08 23:26:28 -03:00
Friedemann Lipphardt
901b028d55
Add option for named Cloudflare tunnels (#3364)
2023-08-08 22:20:27 -03:00
oobabooga
4ba30f6765
Add OpenChat template
2023-08-08 14:10:04 -07:00