oobabooga | 8a7a8343be | Detect TheBloke_WizardLM-30B-GPTQ | 2023-06-09 00:26:34 -03:00
oobabooga | 0f8140e99d | Bump transformers/accelerate/peft/autogptq | 2023-06-09 00:25:13 -03:00
FartyPants | ac40c59ac3 | Added Guanaco-QLoRA to Instruct character (#2574) | 2023-06-08 12:24:32 -03:00
oobabooga | db2cbe7b5a | Detect WizardLM-30B-V1.0 instruction format | 2023-06-08 11:43:40 -03:00
oobabooga | e0b43102e6 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-06-08 11:35:23 -03:00
matatonic | 7be6fe126b | extensions/api: models api for blocking_api (updated) (#2539) | 2023-06-08 11:34:36 -03:00
oobabooga | 240752617d | Increase download timeout to 20s | 2023-06-08 11:16:38 -03:00
zaypen | 084b006cfe | Update LLaMA-model.md (#2460): Better approach of converting LLaMA model | 2023-06-07 15:34:50 -03:00
oobabooga | 878250d609 | Merge branch 'main' into dev | 2023-06-06 19:43:53 -03:00
oobabooga | f55e85e28a | Fix multimodal with model loaded through AutoGPTQ | 2023-06-06 19:42:40 -03:00
oobabooga | eb2601a8c3 | Reorganize Parameters tab | 2023-06-06 14:51:02 -03:00
oobabooga | 3cc5ce3c42 | Merge pull request #2551 from oobabooga/dev | 2023-06-06 14:40:52 -03:00
oobabooga | 6015616338 | Style changes | 2023-06-06 13:06:05 -03:00
oobabooga | f040073ef1 | Handle the case of older autogptq install | 2023-06-06 13:05:05 -03:00
oobabooga | 5d515eeb8c | Bump llama-cpp-python wheel | 2023-06-06 13:01:15 -03:00
oobabooga | bc58dc40bd | Fix a minor bug | 2023-06-06 12:57:13 -03:00
oobabooga | f06a1387f0 | Reorganize Models tab | 2023-06-06 07:58:07 -03:00
oobabooga | d49d299b67 | Change a message | 2023-06-06 07:54:56 -03:00
oobabooga | f9b8bed953 | Remove folder | 2023-06-06 07:49:12 -03:00
oobabooga | 90fdb8edc6 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-06-06 07:46:51 -03:00
oobabooga | 7ed1e35fbf | Reorganize Parameters tab in chat mode | 2023-06-06 07:46:25 -03:00
oobabooga | 00b94847da | Remove softprompt support | 2023-06-06 07:42:23 -03:00
bobzilla | 643c44e975 | Add ngrok shared URL ingress support (#1944) | 2023-06-06 07:34:20 -03:00
oobabooga | ccb4c9f178 | Add some padding to chat box | 2023-06-06 07:21:16 -03:00
oobabooga | 0aebc838a0 | Don't save the history for 'None' character | 2023-06-06 07:21:07 -03:00
oobabooga | 9f215523e2 | Remove some unused imports | 2023-06-06 07:05:46 -03:00
oobabooga | b9bc9665d9 | Remove some extra space | 2023-06-06 07:01:37 -03:00
oobabooga | 177ab7912a | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-06-06 07:01:00 -03:00
oobabooga | 0f0108ce34 | Never load the history for default character | 2023-06-06 07:00:11 -03:00
oobabooga | ae25b21d61 | Improve instruct style in dark mode | 2023-06-06 07:00:00 -03:00
matatonic | 4a17a5db67 | [extensions/openai] various fixes (#2533) | 2023-06-06 01:43:04 -03:00
dependabot[bot] | 97f3fa843f | Bump llama-cpp-python from 0.1.56 to 0.1.57 (#2537) | 2023-06-05 23:45:58 -03:00
oobabooga | 11f38b5c2b | Add AutoGPTQ LoRA support | 2023-06-05 23:32:57 -03:00
oobabooga | 3a5cfe96f0 | Increase chat_prompt_size_max | 2023-06-05 17:37:37 -03:00
oobabooga | 4e9937aa99 | Bump gradio | 2023-06-05 17:29:21 -03:00
pandego | 0377e385e0 | Update .gitignore (#2504): add .idea to git ignore | 2023-06-05 17:11:03 -03:00
oobabooga | 60bfd0b722 | Merge pull request #2535 from oobabooga/dev: Dev branch merge | 2023-06-05 17:07:54 -03:00
oobabooga | eda224c92d | Update README | 2023-06-05 17:04:09 -03:00
oobabooga | bef94b9ebb | Update README | 2023-06-05 17:01:13 -03:00
oobabooga | 99d701994a | Update GPTQ-models-(4-bit-mode).md | 2023-06-05 15:55:00 -03:00
oobabooga | f276d88546 | Use AutoGPTQ by default for GPTQ models | 2023-06-05 15:41:48 -03:00
oobabooga | 632571a009 | Update README | 2023-06-05 15:16:06 -03:00
oobabooga | 6a75bda419 | Assign some 4096 seq lengths | 2023-06-05 12:07:52 -03:00
oobabooga | 9b0e95abeb | Fix "regenerate" when "Start reply with" is set | 2023-06-05 11:56:03 -03:00
oobabooga | e61316ce0b | Detect airoboros and Nous-Hermes | 2023-06-05 11:52:13 -03:00
oobabooga | 19f78684e6 | Add "Start reply with" feature to chat mode | 2023-06-02 13:58:08 -03:00
GralchemOz | f7b07c4705 | Fix the missing Chinese character bug (#2497) | 2023-06-02 13:45:41 -03:00
oobabooga | 28198bc15c | Change some headers | 2023-06-02 11:28:43 -03:00
oobabooga | 5177cdf634 | Change AutoGPTQ info | 2023-06-02 11:19:44 -03:00
oobabooga | 8e98633efd | Add a description for chat_prompt_size | 2023-06-02 11:13:22 -03:00