Commit Graph

490 Commits

Author | SHA1 | Message | Date
Ikko Eltociear Ashimine | 06cc9a85f7 | README: minor typo fix (#4793) | 2023-12-03 22:46:34 -03:00
oobabooga | 000b77a17d | Minor docker changes | 2023-11-29 21:27:23 -08:00
Callum | 88620c6b39 | feature/docker_improvements (#4768) | 2023-11-30 02:20:23 -03:00
oobabooga | ff24648510 | Credit llama-cpp-python in the README | 2023-11-20 12:13:15 -08:00
oobabooga | ef6feedeb2 | Add --nowebui flag for pure API mode (#4651) | 2023-11-18 23:38:39 -03:00
oobabooga | 8f4f4daf8b | Add --admin-key flag for API (#4649) | 2023-11-18 22:33:27 -03:00
oobabooga | d1a58da52f | Update ancient Docker instructions | 2023-11-17 19:52:53 -08:00
oobabooga | e0ca49ed9c | Bump llama-cpp-python to 0.2.18 (2nd attempt) (#4637): Update requirements*.txt; Add back seed | 2023-11-18 00:31:27 -03:00
oobabooga | 9d6f79db74 | Revert "Bump llama-cpp-python to 0.2.18 (#4611)" (reverts commit 923c8e25fb) | 2023-11-17 05:14:25 -08:00
oobabooga | 13dc3b61da | Update README | 2023-11-16 19:57:55 -08:00
oobabooga | 923c8e25fb | Bump llama-cpp-python to 0.2.18 (#4611) | 2023-11-16 22:55:14 -03:00
oobabooga | 322c170566 | Document logits_all | 2023-11-07 14:45:11 -08:00
oobabooga | d59f1ad89a | Update README.md | 2023-11-07 13:05:06 -03:00
oobabooga | ec17a5d2b7 | Make OpenAI API the default API (#4430) | 2023-11-06 02:38:29 -03:00
feng lui | 4766a57352 | transformers: add use_flash_attention_2 option (#4373) | 2023-11-04 13:59:33 -03:00
oobabooga | c0655475ae | Add cache_8bit option | 2023-11-02 11:23:04 -07:00
oobabooga | 77abd9b69b | Add no_flash_attn option | 2023-11-02 11:08:53 -07:00
adrianfiedler | 4bc411332f | Fix broken links (#4367) (Co-authored-by: oobabooga) | 2023-10-23 14:09:57 -03:00
oobabooga | df90d03e0b | Replace --mul_mat_q with --no_mul_mat_q | 2023-10-22 12:23:03 -07:00
oobabooga | caf6db07ad | Update README.md | 2023-10-22 01:22:17 -03:00
oobabooga | 506d05aede | Organize command-line arguments | 2023-10-21 18:52:59 -07:00
oobabooga | ac6d5d50b7 | Update README.md | 2023-10-21 20:03:43 -03:00
oobabooga | 6efb990b60 | Add a proper documentation (#3885) | 2023-10-21 19:15:54 -03:00
oobabooga | b98fbe0afc | Add download link | 2023-10-20 23:58:05 -07:00
Brian Dashore | 3345da2ea4 | Add flash-attention 2 for windows (#4235) | 2023-10-21 03:46:23 -03:00
mjbogusz | 8f6405d2fa | Python 3.11, 3.9, 3.8 support (#4233) (Co-authored-by: oobabooga) | 2023-10-20 21:13:33 -03:00
oobabooga | 43be1be598 | Manually install CUDA runtime libraries | 2023-10-12 21:02:44 -07:00
oobabooga | 2e8b5f7c80 | Update ROCm command | 2023-10-08 10:12:13 -03:00
oobabooga | 00187d641a | Note about pytorch 2.1 breaking change | 2023-10-08 10:10:38 -03:00
oobabooga | 1c6e57dd68 | Note about pytorch 2.1 breaking change | 2023-10-08 10:09:22 -03:00
oobabooga | d33facc9fe | Bump to pytorch 11.8 (#4209) | 2023-10-07 00:23:49 -03:00
oobabooga | 7ffb424c7b | Add AutoAWQ to README | 2023-10-05 09:22:37 -07:00
oobabooga | b6fe6acf88 | Add threads_batch parameter | 2023-10-01 21:28:00 -07:00
StoyanStAtanasov | 7e6ff8d1f0 | Enable NUMA feature for llama_cpp_python (#4040) | 2023-09-26 22:05:00 -03:00
oobabooga | 44438c60e5 | Add INSTALL_EXTENSIONS environment variable | 2023-09-25 13:12:35 -07:00
oobabooga | d0d221df49 | Add --use_fast option (closes #3741) | 2023-09-25 12:19:43 -07:00
oobabooga | 2e7b6b0014 | Create alternative requirements.txt with AMD and Metal wheels (#4052) | 2023-09-24 09:58:29 -03:00
oobabooga | 895ec9dadb | Update README.md | 2023-09-23 15:37:39 -03:00
oobabooga | 299d285ff0 | Update README.md | 2023-09-23 15:36:09 -03:00
oobabooga | 4b4d283a4c | Update README.md | 2023-09-23 00:09:59 -03:00
oobabooga | 0581f1094b | Update README.md | 2023-09-22 23:31:32 -03:00
oobabooga | 968f98a57f | Update README.md | 2023-09-22 23:23:16 -03:00
oobabooga | 72b4ab4c82 | Update README | 2023-09-22 15:20:09 -07:00
oobabooga | 589ee9f623 | Update README.md | 2023-09-22 16:21:48 -03:00
oobabooga | c33a94e381 | Rename doc file | 2023-09-22 12:17:47 -07:00
oobabooga | 6c5f81f002 | Rename webui.py to one_click.py | 2023-09-22 12:00:06 -07:00
oobabooga | fe2acdf45f | Update README.md | 2023-09-22 15:52:20 -03:00
oobabooga | 193fe18c8c | Resolve conflicts | 2023-09-21 17:45:11 -07:00
oobabooga | df39f455ad | Merge remote-tracking branch 'second-repo/main' into merge-second-repo | 2023-09-21 17:39:54 -07:00
James Braza | fee38e0601 | Simplified ExLlama cloning instructions and failure message (#3972) | 2023-09-17 19:26:05 -03:00