Commit Graph

150 Commits

Author SHA1 Message Date
oobabooga
9ff6a538b6 Bump gradio version
Make sure to upgrade with

`pip install -r requirements.txt --upgrade`
2023-03-26 22:11:19 -03:00
Alex "mcmonkey" Goodwin
566898a79a initial lora training tab 2023-03-25 12:08:26 -07:00
oobabooga
7073e96093 Add back RWKV dependency #98 2023-03-19 12:05:28 -03:00
oobabooga
86b99006d9 Remove rwkv dependency 2023-03-18 10:27:52 -03:00
oobabooga
104293f411 Add LoRA support 2023-03-16 21:31:39 -03:00
oobabooga
23a5e886e1 The LLaMA PR has been merged into transformers
https://github.com/huggingface/transformers/pull/21955

The tokenizer class has been changed from

"LLaMATokenizer"

to

"LlamaTokenizer"

This change must be applied to every tokenizer_config.json
that you have used for LLaMA so far.
2023-03-16 11:18:32 -03:00
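The change described above is a one-key rename inside each tokenizer_config.json. A minimal sketch of the edit (the `fix_tokenizer_class` helper is hypothetical, not part of the repo; it only touches the `tokenizer_class` key the commit message names):

```python
import json

def fix_tokenizer_class(config: dict) -> dict:
    """Rename the pre-merge tokenizer class to the spelling used by transformers."""
    if config.get("tokenizer_class") == "LLaMATokenizer":
        config["tokenizer_class"] = "LlamaTokenizer"
    return config

# Example: what the edit does to a minimal tokenizer_config.json payload.
old = {"tokenizer_class": "LLaMATokenizer", "model_max_length": 2048}
print(json.dumps(fix_tokenizer_class(old)))
```

To apply it on disk, load each tokenizer_config.json with `json.load`, pass the result through the helper, and write it back with `json.dump`.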
oobabooga
29b7c5ac0c Sort the requirements 2023-03-15 12:40:03 -03:00
oobabooga
693b53d957 Merge branch 'main' into HideLord-main 2023-03-15 12:08:56 -03:00
dependabot[bot]
02d407542c Bump accelerate from 0.17.0 to 0.17.1
Bumps [accelerate](https://github.com/huggingface/accelerate) from 0.17.0 to 0.17.1.
- [Release notes](https://github.com/huggingface/accelerate/releases)
- [Commits](https://github.com/huggingface/accelerate/compare/v0.17.0...v0.17.1)

---
updated-dependencies:
- dependency-name: accelerate
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:40:42 +00:00
oobabooga
d685332c10 Merge pull request #307 from oobabooga/dependabot/pip/bitsandbytes-0.37.1
Bump bitsandbytes from 0.37.0 to 0.37.1
2023-03-13 22:39:59 -03:00
dependabot[bot]
df83088593 Bump bitsandbytes from 0.37.0 to 0.37.1
Bumps [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) from 0.37.0 to 0.37.1.
- [Release notes](https://github.com/TimDettmers/bitsandbytes/releases)
- [Changelog](https://github.com/TimDettmers/bitsandbytes/blob/main/CHANGELOG.md)
- [Commits](https://github.com/TimDettmers/bitsandbytes/commits)

---
updated-dependencies:
- dependency-name: bitsandbytes
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:36:18 +00:00
dependabot[bot]
715c3ecba6 Bump rwkv from 0.3.1 to 0.4.2
Bumps [rwkv](https://github.com/BlinkDL/ChatRWKV) from 0.3.1 to 0.4.2.
- [Release notes](https://github.com/BlinkDL/ChatRWKV/releases)
- [Commits](https://github.com/BlinkDL/ChatRWKV/commits)

---
updated-dependencies:
- dependency-name: rwkv
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:36:02 +00:00
Alexander Hristov Hristov
63c5a139a2 Merge branch 'main' into main 2023-03-13 19:50:08 +02:00
Luis Cosio
435a69e357 Fix for issue #282
RuntimeError: Tensors must have same number of dimensions: got 3 and 4
2023-03-13 11:41:35 -06:00
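The error fixed above is the classic rank mismatch: an operation receives tensors with different numbers of dimensions. A minimal sketch of the error class and one common remedy (using NumPy rather than PyTorch purely for illustration; this is not the actual fix from the commit):

```python
import numpy as np

a = np.zeros((2, 3, 4))     # rank-3 tensor
b = np.zeros((1, 2, 3, 4))  # rank-4 tensor

try:
    np.concatenate([a, b])  # mixing ranks raises an error
except ValueError as e:
    print("mismatch:", e)

# A common remedy is to make the ranks agree first, e.g. by
# squeezing the leading singleton dimension of the rank-4 tensor:
c = np.concatenate([a, b.squeeze(0)])
print(c.shape)  # (4, 3, 4)
```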
HideLord
683556f411 Adding markdown support and slight refactoring. 2023-03-12 21:34:09 +02:00
oobabooga
441e993c51 Bump accelerate, RWKV and safetensors 2023-03-12 14:25:14 -03:00
oobabooga
3c25557ef0 Add tqdm to requirements.txt 2023-03-12 08:48:16 -03:00
oobabooga
501afbc234 Add requests to requirements.txt 2023-03-11 14:47:30 -03:00
oobabooga
fd540b8930 Use new LLaMA implementation (this will break stuff. I am sorry)
https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model
2023-03-09 17:59:15 -03:00
oobabooga
8660227e1b Add top_k to RWKV 2023-03-07 17:24:28 -03:00
oobabooga
153dfeb4dd Add --rwkv-cuda-on parameter, bump rwkv version 2023-03-06 20:12:54 -03:00
oobabooga
145c725c39 Bump RWKV version 2023-03-05 16:28:21 -03:00
oobabooga
5492e2e9f8 Add sentencepiece 2023-03-05 10:02:24 -03:00
oobabooga
c33715ad5b Move towards HF LLaMA implementation 2023-03-05 01:20:31 -03:00
oobabooga
bcea196c9d Bump flexgen version 2023-03-02 12:03:57 -03:00
oobabooga
7a9b4407b0 Settle for 0.0.6 for now 2023-03-01 17:37:14 -03:00
oobabooga
f351dce032 Keep rwkv up to date 2023-03-01 17:36:16 -03:00
oobabooga
9c86a1cd4a Add RWKV pip package 2023-03-01 11:42:49 -03:00
oobabooga
b16f097466 Add FlexGen to requirements.txt 2023-02-27 08:58:07 -03:00
oobabooga
4548227fb5 Downgrade gradio version (file uploads are broken in 3.19.1) 2023-02-25 22:59:02 -03:00
oobabooga
32f40f3b42 Bump gradio version to 3.19.1 2023-02-25 17:20:03 -03:00
oobabooga
fe1771157f Properly scrape huggingface for download links (for #122) 2023-02-24 14:06:42 -03:00
oobabooga
55bb5e5ef0 Bump accelerate version 2023-02-18 22:15:47 -03:00
oobabooga
a55e8836f6 Bump gradio version
It looks uglier, but the old one was bugged and unstable.
2023-02-15 20:20:56 -03:00
oobabooga
3277b751f5 Add softprompt support (for real this time)
Is this too much voodoo for our purposes?
2023-02-13 15:25:16 -03:00
oobabooga
b4fc8dfa8f Add safetensors version 2023-02-04 18:58:17 -03:00
oobabooga
3dbebe30b1 Remove deepspeed requirement (only works on Linux for now) 2023-02-03 20:07:13 -03:00
oobabooga
03f084f311 Add safetensors support 2023-02-03 18:36:32 -03:00
oobabooga
03ebfba0fb Bump bitsandbytes version 2023-02-03 09:29:31 -03:00
oobabooga
224be31a74 Use main bs4 package 2023-02-02 12:20:58 -03:00
oobabooga
cecaebc291 Add bs4 requirement (fixes #47) 2023-02-02 12:18:32 -03:00
oobabooga
d6b2d68527 Remove redundant requirements 2023-02-02 10:40:09 -03:00
81300
a97afa6965 Add DeepSpeed ZeRO-3 integration 2023-02-01 18:48:13 +02:00
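For context, a DeepSpeed ZeRO-3 integration is driven by a JSON config that sets the optimization stage. A minimal sketch of such a config (an assumption for illustration; the commit's actual settings are not shown here):

```json
{
  "zero_optimization": {
    "stage": 3,
    "offload_param": { "device": "cpu" },
    "offload_optimizer": { "device": "cpu" }
  },
  "train_batch_size": "auto",
  "fp16": { "enabled": true }
}
```

Stage 3 partitions parameters, gradients, and optimizer states across devices; the optional CPU offload entries trade speed for memory headroom.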
oobabooga
414fa9d161 Revert transformers version (gpt-j and opt are broken) 2023-01-26 23:01:13 -03:00
Silver267
ad191e295b Bump transformers
There seem to be a lot of fixes in the new version, and it is working according to my local tests.
2023-01-25 23:22:20 -05:00
oobabooga
fc73188ec7 Allow specifying your own profile picture in chat mode 2023-01-25 19:37:44 -03:00
oobabooga
3fa14befc5 Bump the gradio version, add back the queue 2023-01-25 16:10:35 -03:00
oobabooga
4067cecf67 Bump bitsandbytes version 2023-01-20 12:51:49 -03:00
oobabooga
fed7233ff4 Add script to download models 2023-01-06 19:57:31 -03:00
oobabooga
874cd6ff3f Add the requirements.txt file (duh) 2023-01-06 00:25:33 -03:00