Commit Graph

3962 Commits

Author            SHA1        Message  Date

oobabooga         ed4001e324  Bump ExLlamaV2 to 0.0.18  2024-04-08 18:05:16 -07:00
oobabooga         91a7370a65  Merge pull request #5823 from oobabooga/dev (Merge dev branch)  2024-04-07 11:01:08 -03:00
oobabooga         f6828de3f2  Downgrade llama-cpp-python to 0.2.56  2024-04-07 07:00:12 -07:00
Jared Van Bortel  39ff9c9dcf  requirements: add psutil (#5819)  2024-04-06 23:02:20 -03:00
oobabooga         65099dc192  Merge pull request #5822 from oobabooga/dev (Merge dev branch)  2024-04-06 22:58:06 -03:00
oobabooga         d02744282b  Minor logging change  2024-04-06 18:56:58 -07:00
oobabooga         dfb01f9a63  Bump llama-cpp-python to 0.2.60  2024-04-06 18:32:36 -07:00
oobabooga         096f75a432  Documentation: remove obsolete RWKV docs  2024-04-06 14:06:39 -07:00
oobabooga         dd6e4ac55f  Prevent double <BOS_TOKEN> with Command R+  2024-04-06 13:14:32 -07:00
oobabooga         1bdceea2d4  UI: Focus on the chat input after starting a new chat  2024-04-06 12:57:57 -07:00
oobabooga         168a0f4f67  UI: do not load the "gallery" extension by default  2024-04-06 12:43:21 -07:00
oobabooga         64a76856bd  Metadata: Fix loading Command R+ template with multiple options  2024-04-06 07:32:17 -07:00
oobabooga         1b87844928  Minor fix  2024-04-05 18:43:43 -07:00
oobabooga         6b7f7555fc  Logging message to make transformers loader a bit more transparent  2024-04-05 18:40:02 -07:00
oobabooga         4e739dc211  Add an instruction template for Command R  2024-04-05 18:22:25 -07:00
oobabooga         8a8dbf2f16  Merge remote-tracking branch 'refs/remotes/origin/dev' into dev  2024-04-05 12:42:23 -07:00
oobabooga         0f536dd97d  UI: Fix the "Show controls" action  2024-04-05 12:18:33 -07:00
dependabot[bot]   a4c67e1974  Bump aqlm[cpu,gpu] from 1.1.2 to 1.1.3 (#5790)  2024-04-05 13:26:49 -03:00
oobabooga         14f6194211  Bump Gradio to 4.25  2024-04-05 09:22:44 -07:00
oobabooga         5b91dbb73b  Merge pull request #5810 from oobabooga/dev (Merge dev branch)  2024-04-05 10:55:16 -03:00
oobabooga         308452b783  Bitsandbytes: load preconverted 4bit models without additional flags  2024-04-04 18:10:24 -07:00
oobabooga         d423021a48  Remove CTransformers support (#5807)  2024-04-04 20:23:58 -03:00
oobabooga         13fe38eb27  Remove specialized code for gpt-4chan  2024-04-04 16:11:47 -07:00
oobabooga         3952560da8  Bump llama-cpp-python to 0.2.59  2024-04-04 11:20:48 -07:00
oobabooga         9ab7365b56  Read rope_theta for DBRX model (thanks turboderp)  2024-04-01 20:25:31 -07:00
oobabooga         db5f6cd1d8  Fix ExLlamaV2 loaders using unnecessary "bits" metadata  2024-03-30 21:51:39 -07:00
oobabooga         624faa1438  Fix ExLlamaV2 context length setting (closes #5750)  2024-03-30 21:33:16 -07:00
oobabooga         70c58b5fc2  Bump ExLlamaV2 to 0.0.17  2024-03-30 21:08:26 -07:00
oobabooga         1a7c027386  Merge pull request #5772 from oobabooga/dev (Merge dev branch)  2024-03-29 15:09:53 -03:00
oobabooga         c37f792afa  Better way to handle user_bio default in the API (alternative to bdcf31035f)  2024-03-29 10:54:01 -07:00
oobabooga         9653a9176c  Minor improvements to Parameters tab  2024-03-29 10:41:24 -07:00
oobabooga         3ce0d9221b  Bump transformers to 4.39  2024-03-28 19:40:31 -07:00
oobabooga         e0e28ecb0b  Set the gradio 4 allowed_paths  2024-03-28 15:10:54 -07:00
oobabooga         723f912c16  Fix the "typing dots" position in latest Gradio version  2024-03-28 12:57:35 -07:00
oobabooga         35da6b989d  Organize the parameters tab (#5767)  2024-03-28 16:45:03 -03:00
dependabot[bot]   3609ea69e4  Bump aqlm[cpu,gpu] from 1.1.0 to 1.1.2 (#5728)  2024-03-26 16:36:16 -03:00
Bartowski         9ad116a6e2  Add config for hyperion and hercules models to use chatml (#5742)  2024-03-26 16:35:29 -03:00
wldhx             7cbafc0540  docker: Remove obsolete CLI_ARGS variable (#5726)  2024-03-26 16:34:53 -03:00
Yiximail          bdcf31035f  Set a default empty string for user_bio to fix #5717 issue (#5722)  2024-03-26 16:34:03 -03:00
Yiximail          8c9aca239a  Fix prompt incorrectly set to empty when suffix is empty string (#5757)  2024-03-26 16:33:09 -03:00
oobabooga         2a92a842ce  Bump gradio to 4.23 (#5758)  2024-03-26 16:32:20 -03:00
oobabooga         7cf1402bde  Merge pull request #5716 from oobabooga/dev (Merge dev branch)  2024-03-17 12:34:53 -03:00
oobabooga         49b111e2dd  Lint  2024-03-17 08:33:23 -07:00
oobabooga         d890c99b53  Fix StreamingLLM when content is removed from the beginning of the prompt  2024-03-14 09:18:54 -07:00
oobabooga         d828844a6f  Small fix: don't save truncation_length to settings.yaml (it should derive from model metadata or from a command-line flag)  2024-03-14 08:56:28 -07:00
oobabooga         2ef5490a36  UI: make light theme less blinding  2024-03-13 08:23:16 -07:00
oobabooga         40a60e0297  Convert attention_sink_size to int (closes #5696)  2024-03-13 08:15:49 -07:00
oobabooga         edec3bf3b0  UI: avoid caching convert_to_markdown calls during streaming  2024-03-13 08:14:34 -07:00
oobabooga         8152152dd6  Small fix after 28076928ac  2024-03-11 19:56:35 -07:00
oobabooga         28076928ac  UI: Add a new "User description" field for user personality/biography (#5691)  2024-03-11 23:41:57 -03:00
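A listing with the same Author / SHA1 / Message / Date columns as above can be regenerated from a local clone with git log. This is a sketch: the format string and the 10-character hash length are assumptions chosen to match the table, and the throwaway repository created here (with a sample commit message taken from the log above) only makes the snippet self-contained; in practice you would run the final command inside a clone of the actual repository.

```shell
# Create a throwaway repo so the example runs anywhere (assumption: in real
# use, skip this and run `git log` inside a clone of the repository).
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.name=oobabooga -c user.email=demo@example.com \
    commit -q --allow-empty -m "Bump ExLlamaV2 to 0.0.18"

# One row per commit: author, abbreviated hash, subject, author date.
# --abbrev=10 matches the 10-character SHAs shown in the table above.
git -C "$repo" log -n 20 --abbrev=10 --date=iso \
    --pretty=format:'%an  %h  %s  %ad'
```

The `%ad` placeholder with `--date=iso` prints each author's local timestamp with its UTC offset, which is why the dates above mix -07:00 and -03:00 offsets.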