oobabooga
170e0c05c4
Typo
2023-04-09 17:00:59 -03:00
oobabooga
34ec02d41d
Make download-model.py importable
2023-04-09 16:59:59 -03:00
Blake Wyatt
df561fd896
Fix ggml downloading in download-model.py (#915)
2023-04-08 18:52:30 -03:00
oobabooga
ea6e77df72
Make the code more like PEP8 for readability (#862)
2023-04-07 00:15:45 -03:00
oobabooga
b38ba230f4
Update download-model.py
2023-04-01 15:03:24 -03:00
oobabooga
526d5725db
Update download-model.py
2023-04-01 14:47:47 -03:00
oobabooga
23116b88ef
Add support for resuming downloads (#654 from nikita-skakun/support-partial-downloads)
2023-03-31 22:55:55 -03:00
oobabooga
74462ac713
Don't override the metadata when checking the sha256sum
2023-03-31 22:52:52 -03:00
oobabooga
875de5d983
Update ggml template
2023-03-31 17:57:31 -03:00
oobabooga
6a44f4aec6
Add support for downloading ggml files
2023-03-31 17:33:42 -03:00
Nikita Skakun
b99bea3c69
Fixed reported header affecting resuming download
2023-03-30 23:11:59 -07:00
oobabooga
92c7068daf
Don't download if --check is specified
2023-03-31 01:31:47 -03:00
Nikita Skakun
0cc89e7755
Checksum code now activated by --check flag.
2023-03-30 20:06:12 -07:00
Nikita Skakun
d550c12a3e
Fixed the bug with additional bytes.
The issue seems to be with huggingface not reporting the entire size of the model.
Added an error message with instructions if the checksums don't match.
2023-03-30 12:52:16 -07:00
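The checksum check referenced in this entry is ordinary SHA-256 verification of the downloaded file. A minimal sketch of the idea in Python; the function names and error handling are illustrative, not the actual download-model.py code:

```python
import hashlib

def sha256sum(path, chunk_size=1024 * 1024):
    # Hash the file in chunks so large model files never have to fit in memory.
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

def validate(path, expected_hash):
    actual = sha256sum(path)
    if actual != expected_hash:
        # A mismatch usually means a truncated or corrupted download;
        # the typical remedy is to delete the file and download it again.
        raise ValueError(f'Checksum mismatch for {path}: '
                         f'expected {expected_hash}, got {actual}')
```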
Nikita Skakun
297ac051d9
Added sha256 validation of model files.
2023-03-30 02:34:19 -07:00
Nikita Skakun
8c590c2362
Added a 'clean' flag to not resume download.
2023-03-30 00:42:19 -07:00
Nikita Skakun
e17af59261
Add support for resuming downloads
This commit adds the ability to resume interrupted downloads by adding a new function to the downloader module. The function uses the HTTP Range header to fetch only the remaining part of a file that wasn't downloaded yet.
2023-03-30 00:21:34 -07:00
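The resume mechanism described in the entry above relies on the HTTP Range request header: if a partial file already exists, only the missing byte range is requested. A minimal sketch, assuming the requests library; the names here are illustrative rather than the downloader module's actual API:

```python
import os
import requests

def download_with_resume(url, output_path):
    headers = {}
    mode = 'wb'
    # If a partial file exists, ask the server for only the remaining bytes.
    if os.path.exists(output_path):
        downloaded = os.path.getsize(output_path)
        headers['Range'] = f'bytes={downloaded}-'
        mode = 'ab'
    with requests.get(url, headers=headers, stream=True, timeout=10) as r:
        r.raise_for_status()
        # 206 Partial Content means the server honored the Range request;
        # a plain 200 means it sent the whole file, so start from scratch.
        if r.status_code == 200:
            mode = 'wb'
        with open(output_path, mode) as f:
            for chunk in r.iter_content(chunk_size=1024 * 1024):
                f.write(chunk)
```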
oobabooga
131753fcf5
Save the sha256sum of downloaded models
2023-03-29 23:28:16 -03:00
oobabooga
0345e04249
Fix "Unknown argument(s): {'verbose': False}"
2023-03-29 21:17:48 -03:00
oobabooga
37754164eb
Move argparse
2023-03-29 20:47:36 -03:00
oobabooga
6403e72062
Merge branch 'main' into nikita-skakun-optimize-download-model
2023-03-29 20:45:33 -03:00
oobabooga
1445ea86f7
Add --output and better metadata for downloading models
2023-03-29 20:26:44 -03:00
Nikita Skakun
aaa218a102
Remove unused import.
2023-03-28 18:32:49 -07:00
Nikita Skakun
ff515ec2fe
Improve progress bar visual style
This commit reverts the performance improvements of the previous commit in favor of an improved visual style for the multithreaded progress bars. The progress bars have been restyled to take up the same width so that they align with each other.
2023-03-28 18:29:20 -07:00
Nikita Skakun
4d8e101006
Refactor download process to use multiprocessing
The previous implementation used threads to download files in parallel, which could lead to performance issues due to the Global Interpreter Lock (GIL). This commit refactors the download process to use multiprocessing instead, which allows for true parallelism across multiple CPUs. This results in significantly faster downloads, particularly for large models.
2023-03-28 14:24:23 -07:00
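For context, the shape of such a refactor: one worker function per file, driven by a multiprocessing pool so the workers are separate processes rather than threads. This is a minimal sketch with illustrative names and a placeholder URL, not the commit's actual implementation:

```python
from multiprocessing import Pool

import requests

def download_file(url):
    # Runs in its own process, so it is not serialized by the GIL.
    local_name = url.split('/')[-1]
    with requests.get(url, stream=True, timeout=10) as r:
        r.raise_for_status()
        with open(local_name, 'wb') as f:
            for chunk in r.iter_content(chunk_size=1024 * 1024):
                f.write(chunk)
    return local_name

def download_all(urls, processes=4):
    # A process pool maps the worker over all URLs in parallel across CPUs.
    with Pool(processes=processes) as pool:
        return pool.map(download_file, urls)

if __name__ == '__main__':
    # Placeholder URL; replace with real file links before running.
    download_all(['https://example.com/model-00001-of-00002.safetensors'])
```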
oobabooga
91aa5b460e
If both .pt and .safetensors are present, download only safetensors
2023-03-28 13:08:38 -03:00
Florian Kusche
19174842b8
Also download Markdown files
2023-03-26 19:41:14 +02:00
oobabooga
bb4cb22453
Download .pt files using download-model.py (for 4-bit models)
2023-03-24 00:49:04 -03:00
oobabooga
164e05daad
Download .py files using download-model.py
2023-03-19 20:34:52 -03:00
oobabooga
104293f411
Add LoRA support
2023-03-16 21:31:39 -03:00
oobabooga
1d7e893fa1
Merge pull request #211 from zoidbb/add-tokenizer-to-hf-downloads
download tokenizer when present
2023-03-10 00:46:21 -03:00
oobabooga
875847bf88
Consider tokenizer a type of text
2023-03-10 00:45:28 -03:00
oobabooga
249c268176
Fix the download script for long lists of files on HF
2023-03-10 00:41:10 -03:00
Ber Zoidberg
ec3de0495c
download tokenizer when present
2023-03-09 19:08:09 -08:00
oobabooga
7c70e0e2a6
Fix the download script (sort of)
2023-03-02 14:05:21 -03:00
oobabooga
fe1771157f
Properly scrape huggingface for download links (for #122)
2023-02-24 14:06:42 -03:00
oobabooga
bb1dac2f76
Convert the download option (A-Z) to upper case
2023-02-20 15:50:48 -03:00
oobabooga
fd8070b960
Give some default options in the download script
2023-02-16 23:04:13 -03:00
oobabooga
bf9dd8f8ee
Add --text-only option to the download script
2023-02-12 00:42:56 -03:00
oobabooga
66862203fc
Only download safetensors if both pytorch and safetensors are present
2023-02-12 00:06:22 -03:00
oobabooga
219366342b
Sort imports according to PEP8 (based on #67)
2023-02-10 15:40:03 -03:00
oobabooga
9215e281ba
Add --threads option to the download script
2023-02-03 18:57:12 -03:00
oobabooga
03f084f311
Add safetensors support
2023-02-03 18:36:32 -03:00
oobabooga
1e541d4882
Update download-model.py
2023-01-21 00:43:00 -03:00
oobabooga
18ef72d7c0
Update download-model.py
2023-01-21 00:38:23 -03:00
81300
fffd49e64e
Add --branch option to the model download script
2023-01-20 22:51:56 +02:00
oobabooga
6456777b09
Clean things up
2023-01-16 16:35:45 -03:00
oobabooga
fcda5d7107
Fix the download script on Windows (#6)
2023-01-13 09:05:21 -03:00
oobabooga
dd1bed2d8b
Fix the download script
2023-01-07 16:49:21 -03:00
oobabooga
5345685ead
Make paths cross-platform (should work on Windows now)
2023-01-07 16:33:43 -03:00
oobabooga
fed7233ff4
Add script to download models
2023-01-06 19:57:31 -03:00