Commit Graph

2329 Commits

Author SHA1 Message Date
oobabooga
60a3e70242 Update LLaMA links and info 2023-07-17 12:51:01 -07:00
oobabooga
f83fdb9270 Don't reset LoRA menu when loading a model 2023-07-17 12:50:25 -07:00
oobabooga
4ce766414b Bump AutoGPTQ version 2023-07-17 10:02:12 -07:00
oobabooga
b1a6ea68dd Disable "autoload the model" by default 2023-07-17 07:40:56 -07:00
oobabooga
656b457795 Add Airoboros-v1.2 template 2023-07-17 07:27:42 -07:00
oobabooga
a199f21799 Optimize llamacpp_hf a bit 2023-07-16 20:49:48 -07:00
oobabooga
9f08038864
Merge pull request #3163 from oobabooga/dev
v1.2
2023-07-16 02:43:18 -03:00
oobabooga
6a3edb0542 Clean up llamacpp_hf.py 2023-07-15 22:40:55 -07:00
oobabooga
2de0cedce3 Fix reload screen color 2023-07-15 22:39:39 -07:00
oobabooga
13449aa44d Decrease download timeout 2023-07-15 22:30:08 -07:00
oobabooga
27a84b4e04 Make AutoGPTQ the default again
Purely for compatibility with more models.
You should still use ExLlama_HF for LLaMA models.
2023-07-15 22:29:23 -07:00
oobabooga
5e3f7e00a9
Create llamacpp_HF loader (#3062) 2023-07-16 02:21:13 -03:00
Panchovix
7c4d4fc7d3
Increase alpha value limit for NTK RoPE scaling for exllama/exllama_HF (#3149) 2023-07-16 01:56:04 -03:00
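The alpha value in the commit above controls NTK RoPE scaling. As a hedged illustration (not code from this repository), the commonly cited NTK-aware trick rescales the RoPE frequency base by alpha raised to the power dim/(dim-2); the function name and defaults below are hypothetical:

```python
def ntk_scaled_rope_base(alpha: float, base: float = 10000.0, dim: int = 128) -> float:
    """Scale the RoPE frequency base using the NTK-aware formula.

    A larger alpha stretches the usable context window of a model
    without retraining, trading off some precision at short range.
    With alpha = 1 the base is unchanged.
    """
    return base * alpha ** (dim / (dim - 2))

print(ntk_scaled_rope_base(1.0))  # 10000.0
```

Raising the alpha limit in the UI simply lets users push this base higher for longer contexts.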
ofirkris
780a2f2e16
Bump llama cpp version (#3160)
Bump llama cpp version to support better 8K RoPE scaling

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-07-16 01:54:56 -03:00
jllllll
ed3ffd212d
Bump bitsandbytes to 0.40.1.post1 (#3156)
817bdf6325...6ec4f0c374
2023-07-16 01:53:32 -03:00
oobabooga
94dfcec237
Make it possible to evaluate exllama perplexity (#3138) 2023-07-16 01:52:55 -03:00
oobabooga
b284f2407d Make ExLlama_HF the new default for GPTQ 2023-07-14 14:03:56 -07:00
jllllll
32f12b8bbf
Bump bitsandbytes Windows wheel to 0.40.0.post4 (#3135) 2023-07-13 17:32:37 -03:00
SeanScripts
9800745db9
Color tokens by probability and/or perplexity (#3078) 2023-07-13 17:30:22 -03:00
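One simple way to color tokens by probability, sketched here with hypothetical names (the extension's actual color mapping may differ), is to interpolate between red for unlikely tokens and green for likely ones:

```python
def probability_to_color(p: float) -> str:
    """Map a token probability in [0, 1] to a red-to-green hex color.

    Low-probability (surprising) tokens come out red, high-probability
    tokens green, so a UI can highlight where the model was uncertain.
    """
    p = max(0.0, min(1.0, p))  # clamp out-of-range inputs
    red = int(255 * (1.0 - p))
    green = int(255 * p)
    return f"#{red:02x}{green:02x}00"

print(probability_to_color(1.0))  # #00ff00
print(probability_to_color(0.0))  # #ff0000
```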
oobabooga
146e8b2a6c Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2023-07-13 13:23:38 -07:00
Morgan Schweers
6d1e911577
Add support for logits processors in extensions (#3029) 2023-07-13 17:22:41 -03:00
oobabooga
22341e948d Merge branch 'main' into dev 2023-07-12 14:19:49 -07:00
oobabooga
0e6295886d Fix lora download folder 2023-07-12 14:19:33 -07:00
oobabooga
eb823fce96 Fix typo 2023-07-12 13:55:19 -07:00
oobabooga
d0a626f32f Change reload screen color 2023-07-12 13:54:43 -07:00
oobabooga
c592a9b740 Fix #3117 2023-07-12 13:33:44 -07:00
oobabooga
6447b2eea6
Merge pull request #3116 from oobabooga/dev
v1.1
2023-07-12 15:55:40 -03:00
oobabooga
2463d7c098 Spaces 2023-07-12 11:35:43 -07:00
oobabooga
e202190c4f lint 2023-07-12 11:33:25 -07:00
FartyPants
9b55d3a9f9
More robust and less error-prone training (#3058) 2023-07-12 15:29:43 -03:00
oobabooga
30f37530d5 Add back .replace('\r', '') 2023-07-12 09:52:20 -07:00
Fernando Tarin Morales
987d0fe023
Fix the tokenization process of a raw dataset and improve its efficiency (#3035) 2023-07-12 12:05:37 -03:00
kabachuha
3f19e94c93
Add TensorBoard/Weights & Biases integration for training (#2624) 2023-07-12 11:53:31 -03:00
kizinfo
5d513eea22
Add ability to load all text files from a subdirectory for training (#1997)
* Update utils.py

Returns individual txt files and subdirectories to getdatasets, allowing training from a directory of text files

* Update training.py

Minor tweak to training on raw datasets: detect whether a directory is selected and, if so, load all the txt files in that directory for training

* Update put-trainer-datasets-here.txt

document

* Minor change

* Use pathlib, sort by natural keys

* Space

---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-07-12 11:44:30 -03:00
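The commit above loads every text file under a chosen directory using pathlib and natural-key sorting. A minimal sketch of that approach, with hypothetical function names rather than the repository's actual code:

```python
import re
from pathlib import Path


def natural_key(path: Path):
    """Sort key that orders 'file2.txt' before 'file10.txt'."""
    return [int(tok) if tok.isdigit() else tok.lower()
            for tok in re.split(r"(\d+)", path.name)]


def load_txt_dataset(directory: str) -> str:
    """Concatenate every .txt file in a directory, in natural order."""
    files = sorted(Path(directory).glob("*.txt"), key=natural_key)
    return "\n".join(f.read_text(encoding="utf-8") for f in files)
```

Natural-key sorting matters here because plain lexicographic order would interleave chapters (`chapter10.txt` before `chapter2.txt`) and scramble the training text.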
practicaldreamer
73a0def4af
Add Feature to Log Sample of Training Dataset for Inspection (#1711) 2023-07-12 11:26:45 -03:00
oobabooga
b6ba68eda9 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2023-07-12 07:19:34 -07:00
oobabooga
a17b78d334 Disable wandb during training 2023-07-12 07:19:12 -07:00
Gabriel Pena
eedb3bf023
Add low vram mode on llama cpp (#3076) 2023-07-12 11:05:13 -03:00
oobabooga
180420d2c9 Fix send_pictures extension 2023-07-11 20:56:01 -07:00
original-subliminal-thought-criminal
ad07839a7b
Fix small bug when loading an arbitrary character.json that doesn't exist (#2643)
* Fixes #2482

* Corrected erroneous variable

* Use .exists()

---------

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-07-12 00:16:36 -03:00
Axiom Wolf
d986c17c52
Chat history download creates more detailed file names (#3051) 2023-07-12 00:10:36 -03:00
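A detailed chat-history filename typically combines the character name with a timestamp. The sketch below is a hypothetical illustration of that idea, not the PR's actual implementation:

```python
from datetime import datetime


def history_filename(character: str, mode: str = "chat") -> str:
    """Build a descriptive chat-history filename, e.g.
    'Assistant_chat_20230712-001036.json'. Unsafe characters in the
    character name are stripped so the result is a valid filename.
    """
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    safe = "".join(c for c in character if c.isalnum() or c in "-_") or "history"
    return f"{safe}_{mode}_{stamp}.json"
```

Embedding the character and timestamp means downloaded histories no longer overwrite each other and are easy to sort.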
atriantafy
d9fabdde40
Add context_instruct to API. Load default model instruction template … (#2688) 2023-07-12 00:01:03 -03:00
Salvador E. Tropea
324e45b848
[Fixed] wbits and groupsize values from model not shown (#2977) 2023-07-11 23:27:38 -03:00
oobabooga
e3810dff40 Style changes 2023-07-11 18:49:06 -07:00
oobabooga
bfafd07f44 Change a message 2023-07-11 18:29:20 -07:00
oobabooga
a12dae51b9 Bump bitsandbytes 2023-07-11 18:29:08 -07:00
Keith Kjer
37bffb2e1a
Add reference to new pipeline in multimodal readme (#2947) 2023-07-11 19:04:15 -03:00
Juliano Henriquez
1fc0b5041e
Substitute superbooga Beautiful Soup parser (#2996)
* Add lxml to requirements

* Change Beautiful Soup parser

Use the "lxml" parser, which may be more tolerant of certain kinds of parsing errors than "html.parser" and quicker at the same time.
2023-07-11 19:02:49 -03:00
Salvador E. Tropea
ab044a5a44
Elevenlabs tts fixes (#2959)
* [Fixed] Keep setting option for the voice

- It was always changed to the first available voice
- Also added an error if the selected voice isn't valid

* [Fixed] elevenlabs_tts API key handling

- The one from the settings wasn't applied
- We always got "Enter your API key", even when the settings specified
  an api_key

* [Added] elevenlabs_tts model selection

- Now we can also use the "eleven_multilingual_v1" model.
  Used for anything but English.
2023-07-11 19:00:37 -03:00
micsthepick
3708de2b1f
Respect model dir for downloads (#3077) (#3079) 2023-07-11 18:55:46 -03:00