Commit Graph

3595 Commits

Author              | SHA1       | Message | Date
dependabot[bot]     | af98587580 | Update accelerate requirement from ==0.23.* to ==0.24.* (#4400) | 2023-10-27 00:46:16 -03:00
oobabooga           | 839a87bac8 | Fix is_ccl_available & is_xpu_available imports | 2023-10-26 20:27:04 -07:00
Abhilash Majumder   | 778a010df8 | Intel Gpu support initialization (#4340) | 2023-10-26 23:39:51 -03:00
GuizzyQC            | 317e2c857e | sd_api_pictures: fix Gradio warning message regarding custom value (#4391) | 2023-10-26 23:03:21 -03:00
oobabooga           | 92b2f57095 | Minor metadata bug fix (second attempt) | 2023-10-26 18:57:32 -07:00
oobabooga           | 2d97897a25 | Don't install flash-attention on windows + cuda 11 | 2023-10-25 11:21:18 -07:00
LightningDragon     | 0ced78fdfa | Replace hashlib.sha256 with hashlib.file_digest so we don't need to load entire files into ram before hashing them. (#4383) | 2023-10-25 12:15:34 -03:00
tdrussell           | 72f6fc6923 | Rename additive_repetition_penalty to presence_penalty, add frequency_penalty (#4376) | 2023-10-25 12:10:28 -03:00
oobabooga           | ef1489cd4d | Remove unused parameter in AutoAWQ | 2023-10-23 20:45:43 -07:00
oobabooga           | 1edf321362 | Lint | 2023-10-23 13:09:03 -07:00
oobabooga           | 280ae720d7 | Organize | 2023-10-23 13:07:17 -07:00
oobabooga           | 49e5eecce4 | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-10-23 12:54:05 -07:00
oobabooga           | 82c11be067 | Update 04 - Model Tab.md | 2023-10-23 12:49:07 -07:00
oobabooga           | 306d764ff6 | Minor metadata bug fix | 2023-10-23 12:46:24 -07:00
adrianfiedler       | 4bc411332f | Fix broken links (#4367) | 2023-10-23 14:09:57 -03:00
                      Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga           | 92691ee626 | Disable trust_remote_code by default | 2023-10-23 09:57:44 -07:00
tdrussell           | 4440f87722 | Add additive_repetition_penalty sampler setting. (#3627) | 2023-10-23 02:28:07 -03:00
oobabooga           | 6086768309 | Bump gradio to 3.50.* | 2023-10-22 21:21:26 -07:00
oobabooga           | b8183148cf | Update 04 ‐ Model Tab.md | 2023-10-22 17:15:55 -03:00
oobabooga           | cea7fc2435 | Update html_instruct_style.css | 2023-10-22 12:28:23 -07:00
oobabooga           | df90d03e0b | Replace --mul_mat_q with --no_mul_mat_q | 2023-10-22 12:23:03 -07:00
Googulator          | d0c3b407b3 | transformers loader: multi-LoRAs support (#3120) | 2023-10-22 16:06:22 -03:00
omo                 | 4405513ca5 | Option to select/target additional linear modules/layers in LORA training (#4178) | 2023-10-22 15:57:19 -03:00
oobabooga           | 7a3f885ea8 | Update 03 ‐ Parameters Tab.md | 2023-10-22 14:52:23 -03:00
oobabooga           | 63688004dc | Add default cmd flags to colab | 2023-10-22 09:56:43 -07:00
oobabooga           | 613feca23b | Make colab functional for llama.cpp | 2023-10-22 09:08:25 -07:00
                      - Download only Q4_K_M for GGUF repositories by default
                      - Use maximum n-gpu-layers by default
oobabooga           | 994502d41b | Colab fixes | 2023-10-22 08:57:16 -07:00
Jiashu Xu           | c544f5cc51 | Support LLaVA v1.5 7B (#4348) | 2023-10-22 12:49:04 -03:00
oobabooga           | 05741821a5 | Minor colab changes | 2023-10-22 08:44:35 -07:00
FartyPants (FP HAM) | 6a61158adf | Training PRO a month worth of updates (#4345) | 2023-10-22 12:38:09 -03:00
mongolu             | c18504f369 | USE_CUDA118 from ENV remains null one_click.py + cuda-toolkit (#4352) | 2023-10-22 12:37:24 -03:00
oobabooga           | cd45635f53 | tqdm improvement for colab | 2023-10-21 22:00:29 -07:00
oobabooga           | ae79c510cc | Merge remote-tracking branch 'refs/remotes/origin/main' | 2023-10-21 21:46:15 -07:00
oobabooga           | 2d1b3332e4 | Ignore warnings on Colab | 2023-10-21 21:45:25 -07:00
oobabooga           | caf6db07ad | Update README.md | 2023-10-22 01:22:17 -03:00
oobabooga           | 1a34927314 | Make API URLs more visible | 2023-10-21 21:11:07 -07:00
oobabooga           | 09f807af83 | Use ExLlama_HF for GPTQ models by default | 2023-10-21 20:45:38 -07:00
oobabooga           | 619093483e | Add Colab notebook | 2023-10-21 20:27:52 -07:00
oobabooga           | 506d05aede | Organize command-line arguments | 2023-10-21 18:52:59 -07:00
oobabooga           | b1f33b55fd | Update 01 ‐ Chat Tab.md | 2023-10-21 20:17:56 -03:00
oobabooga           | ac6d5d50b7 | Update README.md | 2023-10-21 20:03:43 -03:00
oobabooga           | 6efb990b60 | Add a proper documentation (#3885) | 2023-10-21 19:15:54 -03:00
Adam White          | 5a5bc135e9 | Docker: Remove explicit CUDA 11.8 Reference (#4343) | 2023-10-21 15:09:34 -03:00
oobabooga           | b98fbe0afc | Add download link | 2023-10-20 23:58:05 -07:00
oobabooga           | fbac6d21ca | Add missing exception | 2023-10-20 23:53:24 -07:00
Brian Dashore       | 3345da2ea4 | Add flash-attention 2 for windows (#4235) | 2023-10-21 03:46:23 -03:00
oobabooga           | 258d046218 | More robust way of initializing empty .git folder | 2023-10-20 23:13:09 -07:00
Johan               | 1d5a015ce7 | Enable special token support for exllamav2 (#4314) | 2023-10-21 01:54:06 -03:00
mjbogusz            | 8f6405d2fa | Python 3.11, 3.9, 3.8 support (#4233) | 2023-10-20 21:13:33 -03:00
                      Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
oobabooga           | 9be74fb57c | Change 2 margins | 2023-10-20 14:04:14 -07:00