oobabooga
39099663a0
Add 4-bit LoRA support (#1200)
2023-04-16 23:26:52 -03:00
dependabot[bot]
4cd2a9d824
Bump transformers from 4.28.0 to 4.28.1 (#1288)
2023-04-16 21:12:57 -03:00
oobabooga
d2ea925fa5
Bump llama-cpp-python to use LlamaCache
2023-04-16 00:53:40 -03:00
catalpaaa
94700cc7a5
Bump gradio to 3.25 (#1089)
2023-04-14 23:45:25 -03:00
Alex "mcmonkey" Goodwin
64e3b44e0f
initial multi-lora support (#1103)
...
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-04-14 14:52:06 -03:00
dependabot[bot]
852a5aa13d
Bump bitsandbytes from 0.37.2 to 0.38.1 (#1158)
2023-04-13 21:23:14 -03:00
dependabot[bot]
84576a80d2
Bump llama-cpp-python from 0.1.30 to 0.1.33 (#1157)
2023-04-13 21:17:59 -03:00
oobabooga
2908a51587
Settle for transformers 4.28.0
2023-04-13 21:07:00 -03:00
oobabooga
32d078487e
Add llama-cpp-python to requirements.txt
2023-04-10 10:45:51 -03:00
oobabooga
d272ac46dd
Add Pillow as a requirement
2023-04-08 18:48:46 -03:00
oobabooga
58ed87e5d9
Update requirements.txt
2023-04-06 18:42:54 -03:00
dependabot[bot]
21be80242e
Bump rwkv from 0.7.2 to 0.7.3 (#842)
2023-04-06 17:52:27 -03:00
oobabooga
113f94b61e
Bump transformers (16-bit llama must be reconverted/redownloaded)
2023-04-06 16:04:03 -03:00
oobabooga
59058576b5
Remove unused requirement
2023-04-06 13:28:21 -03:00
oobabooga
03cb44fc8c
Add new llama.cpp library (2048 context, temperature, etc now work)
2023-04-06 13:12:14 -03:00
oobabooga
b2ce7282a1
Use past transformers version #773
2023-04-04 16:11:42 -03:00
dependabot[bot]
ad37f396fc
Bump rwkv from 0.7.1 to 0.7.2 (#747)
2023-04-03 14:29:57 -03:00
dependabot[bot]
18f756ada6
Bump gradio from 3.24.0 to 3.24.1 (#746)
2023-04-03 14:29:37 -03:00
TheTerrasque
2157bb4319
New yaml character format (#337 from TheTerrasque/feature/yaml-characters)
...
This doesn't break backward compatibility with JSON characters.
2023-04-02 20:34:25 -03:00
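For reference, a character file in the new format might look like the sketch below. The field names (`name`, `greeting`, `context`) are an assumption based on the web UI's usual character fields; this log does not specify the schema, so check the project's character documentation before relying on them.

```yaml
# Hypothetical YAML character file — field names are assumed, not
# confirmed by this commit log.
name: Chiharu Yamada
greeting: "Hey! Great to meet you. I heard you're into technology too?"
context: |
  Chiharu is an energetic engineering student who loves talking
  about computers, gadgets, and anything technical.
```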
oobabooga
a5c9b7d977
Bump llamacpp version
2023-03-31 15:08:01 -03:00
oobabooga
4d98623041
Merge branch 'main' into feature/llamacpp
2023-03-31 14:37:04 -03:00
oobabooga
9d1dcf880a
General improvements
2023-03-31 14:27:01 -03:00
oobabooga
f27a66b014
Bump gradio version (make sure to update)
...
This fixes the textbox shrinking vertically once it reaches
a certain number of lines.
2023-03-31 00:42:26 -03:00
Thomas Antony
8953a262cb
Add llamacpp to requirements.txt
2023-03-30 11:22:38 +01:00
Alex "mcmonkey" Goodwin
b0f05046b3
remove duplicate import
2023-03-27 22:50:37 -07:00
Alex "mcmonkey" Goodwin
31f04dc615
Merge branch 'main' into add-train-lora-tab
2023-03-27 20:03:30 -07:00
dependabot[bot]
1e02f75f2b
Bump accelerate from 0.17.1 to 0.18.0
...
Bumps [accelerate](https://github.com/huggingface/accelerate) from 0.17.1 to 0.18.0.
- [Release notes](https://github.com/huggingface/accelerate/releases)
- [Commits](https://github.com/huggingface/accelerate/compare/v0.17.1...v0.18.0)
---
updated-dependencies:
- dependency-name: accelerate
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com>
2023-03-28 01:19:34 +00:00
oobabooga
37f11803e3
Merge pull request #603 from oobabooga/dependabot/pip/rwkv-0.7.1
...
Bump rwkv from 0.7.0 to 0.7.1
2023-03-27 22:19:08 -03:00
dependabot[bot]
e9c0226b09
Bump rwkv from 0.7.0 to 0.7.1
...
Bumps [rwkv](https://github.com/BlinkDL/ChatRWKV) from 0.7.0 to 0.7.1.
- [Release notes](https://github.com/BlinkDL/ChatRWKV/releases)
- [Commits](https://github.com/BlinkDL/ChatRWKV/commits)
---
updated-dependencies:
- dependency-name: rwkv
dependency-type: direct:production
update-type: version-update:semver-patch
...
Signed-off-by: dependabot[bot] <support@github.com>
2023-03-27 21:05:35 +00:00
dependabot[bot]
9c96919121
Bump bitsandbytes from 0.37.1 to 0.37.2
...
Bumps [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) from 0.37.1 to 0.37.2.
- [Release notes](https://github.com/TimDettmers/bitsandbytes/releases)
- [Changelog](https://github.com/TimDettmers/bitsandbytes/blob/main/CHANGELOG.md)
- [Commits](https://github.com/TimDettmers/bitsandbytes/commits)
---
updated-dependencies:
- dependency-name: bitsandbytes
dependency-type: direct:production
update-type: version-update:semver-patch
...
Signed-off-by: dependabot[bot] <support@github.com>
2023-03-27 21:05:19 +00:00
Alex "mcmonkey" Goodwin
e439228ed8
Merge branch 'main' into add-train-lora-tab
2023-03-27 08:21:19 -07:00
oobabooga
9ff6a538b6
Bump gradio version
...
Make sure to upgrade with
`pip install -r requirements.txt --upgrade`
2023-03-26 22:11:19 -03:00
Alex "mcmonkey" Goodwin
566898a79a
initial lora training tab
2023-03-25 12:08:26 -07:00
oobabooga
7073e96093
Add back RWKV dependency #98
2023-03-19 12:05:28 -03:00
oobabooga
86b99006d9
Remove rwkv dependency
2023-03-18 10:27:52 -03:00
oobabooga
104293f411
Add LoRA support
2023-03-16 21:31:39 -03:00
oobabooga
23a5e886e1
The LLaMA PR has been merged into transformers
...
https://github.com/huggingface/transformers/pull/21955
The tokenizer class has been changed from
"LLaMATokenizer"
to
"LlamaTokenizer"
You need to apply this change to every tokenizer_config.json
that you have downloaded for LLaMA so far.
2023-03-16 11:18:32 -03:00
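The rename described above can be scripted rather than edited by hand. Below is a minimal sketch, assuming the file stores the class name under the standard Hugging Face `tokenizer_class` key; the function name `fix_tokenizer_class` is ours, not part of the project.

```python
import json
from pathlib import Path

def fix_tokenizer_class(config_path: str) -> bool:
    """Rewrite the old "LLaMATokenizer" class name to "LlamaTokenizer"
    in a tokenizer_config.json. Returns True if the file was changed."""
    path = Path(config_path)
    config = json.loads(path.read_text())
    if config.get("tokenizer_class") == "LLaMATokenizer":
        config["tokenizer_class"] = "LlamaTokenizer"
        path.write_text(json.dumps(config, indent=2))
        return True
    return False
```

Run it once per LLaMA model directory; files already using the new class name are left untouched.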
oobabooga
29b7c5ac0c
Sort the requirements
2023-03-15 12:40:03 -03:00
oobabooga
693b53d957
Merge branch 'main' into HideLord-main
2023-03-15 12:08:56 -03:00
dependabot[bot]
02d407542c
Bump accelerate from 0.17.0 to 0.17.1
...
Bumps [accelerate](https://github.com/huggingface/accelerate) from 0.17.0 to 0.17.1.
- [Release notes](https://github.com/huggingface/accelerate/releases)
- [Commits](https://github.com/huggingface/accelerate/compare/v0.17.0...v0.17.1)
---
updated-dependencies:
- dependency-name: accelerate
dependency-type: direct:production
update-type: version-update:semver-patch
...
Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:40:42 +00:00
oobabooga
d685332c10
Merge pull request #307 from oobabooga/dependabot/pip/bitsandbytes-0.37.1
...
Bump bitsandbytes from 0.37.0 to 0.37.1
2023-03-13 22:39:59 -03:00
dependabot[bot]
df83088593
Bump bitsandbytes from 0.37.0 to 0.37.1
...
Bumps [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) from 0.37.0 to 0.37.1.
- [Release notes](https://github.com/TimDettmers/bitsandbytes/releases)
- [Changelog](https://github.com/TimDettmers/bitsandbytes/blob/main/CHANGELOG.md)
- [Commits](https://github.com/TimDettmers/bitsandbytes/commits)
---
updated-dependencies:
- dependency-name: bitsandbytes
dependency-type: direct:production
update-type: version-update:semver-patch
...
Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:36:18 +00:00
dependabot[bot]
715c3ecba6
Bump rwkv from 0.3.1 to 0.4.2
...
Bumps [rwkv](https://github.com/BlinkDL/ChatRWKV) from 0.3.1 to 0.4.2.
- [Release notes](https://github.com/BlinkDL/ChatRWKV/releases)
- [Commits](https://github.com/BlinkDL/ChatRWKV/commits)
---
updated-dependencies:
- dependency-name: rwkv
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:36:02 +00:00
Alexander Hristov Hristov
63c5a139a2
Merge branch 'main' into main
2023-03-13 19:50:08 +02:00
Luis Cosio
435a69e357
Fix for issue #282
...
RuntimeError: Tensors must have same number of dimensions: got 3 and 4
2023-03-13 11:41:35 -06:00
HideLord
683556f411
Adding markdown support and slight refactoring.
2023-03-12 21:34:09 +02:00
oobabooga
441e993c51
Bump accelerate, RWKV and safetensors
2023-03-12 14:25:14 -03:00
oobabooga
3c25557ef0
Add tqdm to requirements.txt
2023-03-12 08:48:16 -03:00
oobabooga
501afbc234
Add requests to requirements.txt
2023-03-11 14:47:30 -03:00
oobabooga
fd540b8930
Use new LLaMA implementation (this will break stuff. I am sorry)
...
https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model
2023-03-09 17:59:15 -03:00