author | commit | subject | date

oobabooga | 4ea260098f | llama.cpp: add 4-bit/8-bit kv cache options | 2024-06-29 09:10:33 -07:00
oobabooga | 220c1797fc | UI: do not show the "save character" button in the Chat tab | 2024-06-28 22:11:31 -07:00
oobabooga | f62aad3d59 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2024-06-28 21:42:03 -07:00
oobabooga | 8803ae1845 | UI: decrease the number of lines for "Command for chat-instruct mode" | 2024-06-28 21:41:30 -07:00
mamei16 | cc825dd1f4 | Addressing Whisper STT issues (#5929) | 2024-06-29 01:32:54 -03:00
oobabooga | 5c6b9c610d | UI: allow the character dropdown to coexist in the Chat tab and the Parameters tab (#6177) | 2024-06-29 01:20:27 -03:00
oobabooga | de69a62004 | Revert "UI: move "Character" dropdown to the main Chat tab" (reverts commit 83534798b2) | 2024-06-28 15:38:11 -07:00
oobabooga | 38d58764db | UI: remove unused gr.State variable from the Default tab | 2024-06-28 15:17:44 -07:00
oobabooga | 04cb197ed6 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2024-06-27 21:25:23 -07:00
oobabooga | da196707cf | UI: improve the light theme a bit | 2024-06-27 21:05:38 -07:00
dependabot[bot] | 9660f6f10e | Bump aqlm[cpu,gpu] from 1.1.5 to 1.1.6 (#6157) | 2024-06-27 21:13:02 -03:00
dependabot[bot] | a5df8f4e3c | Bump jinja2 from 3.1.2 to 3.1.4 (#6172) | 2024-06-27 21:12:39 -03:00
dependabot[bot] | c6cec0588c | Update accelerate requirement from ==0.30.* to ==0.31.* (#6156) | 2024-06-27 21:12:02 -03:00
oobabooga | 1da47f2ae6 | Make dependabot target the dev branch | 2024-06-27 17:07:04 -07:00
oobabooga | 9dbcb1aeea | Small fix to make transformers 4.42 functional | 2024-06-27 17:05:29 -07:00
oobabooga | 66090758df | Bump transformers to 4.42 (for gemma support) | 2024-06-27 11:26:02 -07:00
oobabooga | 8ec8bc0b85 | UI: handle another edge case while streaming lists | 2024-06-26 18:40:43 -07:00
oobabooga | 0e138e4be1 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2024-06-26 18:30:08 -07:00
mefich | a85749dcbe | Update models_settings.py: add default alpha_value, add proper compress_pos_emb for newer GGUFs (#6111) | 2024-06-26 22:17:56 -03:00
oobabooga | 5fe532a5ce | UI: remove DRY info text (it was visible for loaders without DRY) | 2024-06-26 15:33:11 -07:00
oobabooga | b1187fc9a5 | UI: prevent flickering while streaming lists / bullet points | 2024-06-25 19:19:45 -07:00
oobabooga | 3691451d00 | Add back the "Rename chat" feature (#6161) | 2024-06-25 22:28:58 -03:00
oobabooga | 53fbd2f245 | Add TensorRT-LLM to the README | 2024-06-25 14:45:37 -07:00
oobabooga | ac3f92d36a | UI: store chat history in the browser | 2024-06-25 14:18:07 -07:00
oobabooga | 46ca15cb79 | Minor bug fixes after e7e1f5901e | 2024-06-25 11:49:33 -07:00
oobabooga | 83534798b2 | UI: move "Character" dropdown to the main Chat tab | 2024-06-25 11:25:57 -07:00
oobabooga | 279cba607f | UI: don't show an animation when updating the "past chats" menu | 2024-06-25 11:10:17 -07:00
oobabooga | 3290edfad9 | Bug fix: force chat history to be loaded on launch | 2024-06-25 11:06:05 -07:00
oobabooga | e7e1f5901e | Prompts in the "past chats" menu (#6160) | 2024-06-25 15:01:43 -03:00
oobabooga | 602b455507 | Bump llama-cpp-python to 0.2.79 | 2024-06-24 20:26:38 -07:00
oobabooga | a43c210617 | Improved past chats menu (#6158) | 2024-06-25 00:07:22 -03:00
oobabooga | 96ba53d916 | Handle another fix after 57119c1b30 | 2024-06-24 15:51:12 -07:00
oobabooga | 7db8b3b532 | Bump ExLlamaV2 to 0.1.6 | 2024-06-24 05:38:11 -07:00
oobabooga | 35f32d08bc | GitHub: Increase the stalebot time to 6 months | 2024-06-23 22:34:18 -07:00
oobabooga | 564a3e1553 | Remove the awkward "Tab" keyboard shortcut | 2024-06-23 22:31:07 -07:00
oobabooga | 577a8cd3ee | Add TensorRT-LLM support (#5715) | 2024-06-24 02:30:03 -03:00
oobabooga | 536f8d58d4 | Do not expose alpha_value to llama.cpp & rope_freq_base to transformers (to avoid confusion) | 2024-06-23 22:09:24 -07:00
oobabooga | b48ab482f8 | Remove obsolete "gptq_for_llama_info" message | 2024-06-23 22:05:19 -07:00
oobabooga | 5e8dc56f8a | Fix after previous commit | 2024-06-23 21:58:28 -07:00
Louis Del Valle | 57119c1b30 | Update block_requests.py to resolve unexpected type error (500 error) (#5976) | 2024-06-24 01:56:51 -03:00
oobabooga | 125bb7b03b | Revert "Bump llama-cpp-python to 0.2.78" (reverts commit b6eaf7923e) | 2024-06-23 19:54:28 -07:00
CharlesCNorton | 5993904acf | Fix several typos in the codebase (#6151) | 2024-06-22 21:40:25 -03:00
GodEmperor785 | 2c5a9eb597 | Change limits of RoPE scaling sliders in UI (#6142) | 2024-06-19 21:42:17 -03:00
oobabooga | 5904142777 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2024-06-19 17:41:09 -07:00
oobabooga | b10d735176 | Minor CSS linting | 2024-06-19 17:40:33 -07:00
Guanghua Lu | 229d89ccfb | Make logs more readable, no more \u7f16\u7801 (#6127) | 2024-06-15 23:00:13 -03:00
oobabooga | fd7c3c5bb0 | Don't git pull on installation (to make past releases installable) | 2024-06-15 06:38:05 -07:00
oobabooga | b6eaf7923e | Bump llama-cpp-python to 0.2.78 | 2024-06-14 21:22:09 -07:00
oobabooga | 9420973b62 | Downgrade PyTorch to 2.2.2 (#6124) | 2024-06-14 16:42:03 -03:00
Forkoz | 1576227f16 | Fix GGUFs with no BOS token present, mainly qwen2 models (#6119) (co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2024-06-14 13:51:01 -03:00