76c212d322  FartyPants        2023-06-10 23:41:45 -04:00
    Avoid using 'default' as adapter_name in LORA

ea0eabd266  oobabooga         2023-06-10 21:59:29 -03:00
    Bump llama-cpp-python version

ec2b5bae39  oobabooga         2023-06-10 21:55:59 -03:00
    Merge pull request #2616 from oobabooga/dev
    Merge dev branch

b04e18d10c  brandonj60        2023-06-09 21:26:31 -03:00
    Add Mirostat v2 sampling to transformer models (#2571)

aff3e04df4  oobabooga         2023-06-09 21:15:37 -03:00
    Remove irrelevant docs
    Compiling from source, in my tests, makes no difference in the resulting tokens/s.

d7db25dac9  oobabooga         2023-06-09 01:44:17 -03:00
    Fix a permission

d033c85cf9  oobabooga         2023-06-09 01:43:22 -03:00
    Fix a permission

741afd74f6  oobabooga         2023-06-09 00:48:41 -03:00
    Update requirements-minimal.txt

c333e4c906  oobabooga         2023-06-09 00:47:48 -03:00
    Add docs for performance optimizations

aaf240a14c  oobabooga         2023-06-09 00:30:59 -03:00
    Merge pull request #2587 from oobabooga/dev

c6552785af  oobabooga         2023-06-09 00:30:22 -03:00
    Minor cleanup

92b45cb3f5  oobabooga         2023-06-09 00:27:11 -03:00
    Merge branch 'main' into dev

8a7a8343be  oobabooga         2023-06-09 00:26:34 -03:00
    Detect TheBloke_WizardLM-30B-GPTQ

0f8140e99d  oobabooga         2023-06-09 00:25:13 -03:00
    Bump transformers/accelerate/peft/autogptq

ac40c59ac3  FartyPants        2023-06-08 12:24:32 -03:00
    Added Guanaco-QLoRA to Instruct character (#2574)

db2cbe7b5a  oobabooga         2023-06-08 11:43:40 -03:00
    Detect WizardLM-30B-V1.0 instruction format

e0b43102e6  oobabooga         2023-06-08 11:35:23 -03:00
    Merge remote-tracking branch 'refs/remotes/origin/dev' into dev

7be6fe126b  matatonic         2023-06-08 11:34:36 -03:00
    extensions/api: models api for blocking_api (updated) (#2539)

240752617d  oobabooga         2023-06-08 11:16:38 -03:00
    Increase download timeout to 20s

084b006cfe  zaypen            2023-06-07 15:34:50 -03:00
    Update LLaMA-model.md (#2460)
    Better approach of converting LLaMA model

c05edfcdfc  dnobs             2023-06-07 00:08:04 -03:00
    fix: reverse-proxied URI should end with 'chat', not 'generate' (#2556)

878250d609  oobabooga         2023-06-06 19:43:53 -03:00
    Merge branch 'main' into dev

f55e85e28a  oobabooga         2023-06-06 19:42:40 -03:00
    Fix multimodal with model loaded through AutoGPTQ

eb2601a8c3  oobabooga         2023-06-06 14:51:02 -03:00
    Reorganize Parameters tab

3cc5ce3c42  oobabooga         2023-06-06 14:40:52 -03:00
    Merge pull request #2551 from oobabooga/dev

6015616338  oobabooga         2023-06-06 13:06:05 -03:00
    Style changes

f040073ef1  oobabooga         2023-06-06 13:05:05 -03:00
    Handle the case of older autogptq install

5d515eeb8c  oobabooga         2023-06-06 13:01:15 -03:00
    Bump llama-cpp-python wheel

bc58dc40bd  oobabooga         2023-06-06 12:57:13 -03:00
    Fix a minor bug

f06a1387f0  oobabooga         2023-06-06 07:58:07 -03:00
    Reorganize Models tab

d49d299b67  oobabooga         2023-06-06 07:54:56 -03:00
    Change a message

f9b8bed953  oobabooga         2023-06-06 07:49:12 -03:00
    Remove folder

90fdb8edc6  oobabooga         2023-06-06 07:46:51 -03:00
    Merge remote-tracking branch 'refs/remotes/origin/dev' into dev

7ed1e35fbf  oobabooga         2023-06-06 07:46:25 -03:00
    Reorganize Parameters tab in chat mode

00b94847da  oobabooga         2023-06-06 07:42:23 -03:00
    Remove softprompt support

643c44e975  bobzilla          2023-06-06 07:34:20 -03:00
    Add ngrok shared URL ingress support (#1944)

ccb4c9f178  oobabooga         2023-06-06 07:21:16 -03:00
    Add some padding to chat box

0aebc838a0  oobabooga         2023-06-06 07:21:07 -03:00
    Don't save the history for 'None' character

9f215523e2  oobabooga         2023-06-06 07:05:46 -03:00
    Remove some unused imports

b9bc9665d9  oobabooga         2023-06-06 07:01:37 -03:00
    Remove some extra space

177ab7912a  oobabooga         2023-06-06 07:01:00 -03:00
    Merge remote-tracking branch 'refs/remotes/origin/dev' into dev

0f0108ce34  oobabooga         2023-06-06 07:00:11 -03:00
    Never load the history for default character

ae25b21d61  oobabooga         2023-06-06 07:00:00 -03:00
    Improve instruct style in dark mode

4a17a5db67  matatonic         2023-06-06 01:43:04 -03:00
    [extensions/openai] various fixes (#2533)

97f3fa843f  dependabot[bot]   2023-06-05 23:45:58 -03:00
    Bump llama-cpp-python from 0.1.56 to 0.1.57 (#2537)

11f38b5c2b  oobabooga         2023-06-05 23:32:57 -03:00
    Add AutoGPTQ LoRA support

3a5cfe96f0  oobabooga         2023-06-05 17:37:37 -03:00
    Increase chat_prompt_size_max

4e9937aa99  oobabooga         2023-06-05 17:29:21 -03:00
    Bump gradio

0377e385e0  pandego           2023-06-05 17:11:03 -03:00
    Update .gitignore (#2504)
    add .idea to git ignore

60bfd0b722  oobabooga         2023-06-05 17:07:54 -03:00
    Merge pull request #2535 from oobabooga/dev
    Dev branch merge