Author | Commit | Message | Date
oobabooga | 3d854ee516 | Pin PyTorch version to 2.1 (#5056) | 2024-01-04 23:50:23 -03:00
Matthew Raaff | c9c31f71b8 | Various one-click installer improvements (#4994); Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com> | 2024-01-04 23:41:54 -03:00
oobabooga | c9d814592e | Increase maximum temperature value to 5 | 2024-01-04 17:28:15 -08:00
Guanghua Lu | 3bb4b0504e | Close the menu on second click. (#5110) | 2024-01-04 13:52:11 -03:00
oobabooga | e4d724eb3f | Fix cache_folder bug introduced in 37eff915d6 | 2024-01-04 07:49:40 -08:00
Alberto Cano | 37eff915d6 | Use --disk-cache-dir for all caches | 2024-01-04 00:27:26 -03:00
oobabooga | c54d1daaaa | Merge pull request #5163 from oobabooga/dev (Merge dev branch) | 2024-01-03 22:57:00 -03:00
Lounger | 7965f6045e | Fix loading latest history for file names with dots (#5162) | 2024-01-03 22:39:41 -03:00
Adam Florizone | 894e1a0700 | Docker: added build args for non AVX2 CPU (#5154) | 2024-01-03 20:43:02 -03:00
AstrisCantCode | b80e6365d0 | Fix various bugs for LoRA training (#5161) | 2024-01-03 20:42:20 -03:00
oobabooga | f6a204d7c9 | Bump llama-cpp-python to 0.2.26 | 2024-01-03 11:06:36 -08:00
oobabooga | 3a6cba9021 | Add top_k=1 to Debug-deterministic preset (makes it work with llama.cpp) | 2024-01-02 15:54:56 -08:00
oobabooga | 3f28925a8d | Merge pull request #5152 from oobabooga/dev (Merge dev branch) | 2024-01-02 13:22:14 -03:00
oobabooga | 7cce88c403 | Remove an unnecessary exception | 2024-01-02 07:20:59 -08:00
oobabooga | 90c7e84b01 | UI: improve chat style margin for last bot message | 2024-01-01 19:50:13 -08:00
oobabooga | a4b4708560 | Decrease "Show controls" button opacity | 2024-01-01 19:08:30 -08:00
oobabooga | 94afa0f9cf | Minor style changes | 2024-01-01 16:00:22 -08:00
oobabooga | 3e3a66e721 | Merge pull request #5132 from oobabooga/dev (Merge dev branch) | 2023-12-31 02:32:25 -03:00
oobabooga | cbf6f9e695 | Update some UI messages | 2023-12-30 21:31:17 -08:00
oobabooga | 2aad91f3c9 | Remove deprecated command-line flags (#5131) | 2023-12-31 02:07:48 -03:00
TheInvisibleMage | 485b85ee76 | Superboogav2 Quick Fixes (#5089) | 2023-12-31 02:03:23 -03:00
oobabooga | 2734ce3e4c | Remove RWKV loader (#5130) | 2023-12-31 02:01:40 -03:00
oobabooga | 0e54a09bcb | Remove exllamav1 loaders (#5128) | 2023-12-31 01:57:06 -03:00
oobabooga | 8e397915c9 | Remove --sdp-attention, --xformers flags (#5126) | 2023-12-31 01:36:51 -03:00
B611 | b7dd1f9542 | Specify utf-8 encoding for model metadata file open (#5125) | 2023-12-31 01:34:32 -03:00
oobabooga | 20a2eaaf95 | Add .vs to .gitignore | 2023-12-27 12:58:07 -08:00
oobabooga | a4079e879e | CSS: don't change --chat-height when outside the chat tab | 2023-12-27 11:51:55 -08:00
oobabooga | c419206ce1 | Lint the JS/CSS | 2023-12-27 09:59:23 -08:00
oobabooga | 3fd7073808 | Merge pull request #5100 from oobabooga/dev (Merge dev branch) | 2023-12-27 13:23:28 -03:00
oobabooga | 648c2d1cc2 | Update settings-template.yaml | 2023-12-25 15:25:16 -08:00
oobabooga | c21e3d6300 | Merge pull request #5044 from TheLounger/style_improvements (Improve chat styles) | 2023-12-25 20:00:50 -03:00
oobabooga | 2ad6c526b8 | Check if extensions block exists before changing it | 2023-12-25 14:43:12 -08:00
oobabooga | 63553b41ed | Improve some paddings | 2023-12-25 14:25:31 -08:00
oobabooga | abd227594c | Fix a border radius | 2023-12-25 14:17:00 -08:00
oobabooga | 8d0359a6d8 | Rename some CSS variables | 2023-12-25 14:10:07 -08:00
oobabooga | 5466ae59a7 | Prevent input/chat area overlap with new --my-delta variable | 2023-12-25 14:07:31 -08:00
oobabooga | 19d13743a6 | Merge pull request #5078 from oobabooga/dev (Merge dev branch) | 2023-12-25 17:23:01 -03:00
oobabooga | 02d063fb9f | Fix extra space after 18ca35faaa | 2023-12-25 08:38:17 -08:00
oobabooga | ae927950a8 | Remove instruct style border radius | 2023-12-25 08:35:33 -08:00
oobabooga | 18ca35faaa | Space between chat tab and extensions block | 2023-12-25 08:34:02 -08:00
oobabooga | 73ba7a8921 | Change height -> min-height for .chat | 2023-12-25 08:32:02 -08:00
oobabooga | 29b0f14d5a | Bump llama-cpp-python to 0.2.25 (#5077) | 2023-12-25 12:36:32 -03:00
oobabooga | af876095e2 | Merge pull request #5073 from oobabooga/dev (Merge dev branch) | 2023-12-25 02:58:45 -03:00
oobabooga | c06f630bcc | Increase max_updates_second maximum value | 2023-12-24 13:29:47 -08:00
Casper | 92d5e64a82 | Bump AutoAWQ to 0.1.8 (#5061) | 2023-12-24 14:27:34 -03:00
oobabooga | 4aeebfc571 | Merge branch 'dev' into TheLounger-style_improvements | 2023-12-24 09:24:55 -08:00
oobabooga | d76b00c211 | Pin lm_eval package version | 2023-12-24 09:22:31 -08:00
oobabooga | 8c60495878 | UI: add "Maximum UI updates/second" parameter | 2023-12-24 09:17:40 -08:00
zhangningboo | 1b8b61b928 | Fix output_ids decoding for Qwen/Qwen-7B-Chat (#5045) | 2023-12-22 23:11:02 -03:00
kabachuha | dbe438564e | Support for sending images into OpenAI chat API (#4827) | 2023-12-22 22:45:53 -03:00