| Author | Commit | Message | Date |
|---|---|---|---|
| oobabooga | 623c92792a | Update README | 2023-12-14 07:56:48 -08:00 |
| oobabooga | 3580bed041 | Update README | 2023-12-14 07:54:20 -08:00 |
| oobabooga | d5ec3c3444 | Update README | 2023-12-14 06:20:52 -08:00 |
| oobabooga | 5b283fff22 | Update README | 2023-12-14 06:15:14 -08:00 |
| oobabooga | 958799221f | Update README | 2023-12-14 06:09:03 -08:00 |
| oobabooga | e7fa17740a | Update README | 2023-12-13 22:49:42 -08:00 |
| oobabooga | 03babe7d81 | Update README | 2023-12-13 22:47:08 -08:00 |
| oobabooga | aad14174e4 | Update README | 2023-12-13 22:46:18 -08:00 |
| oobabooga | 783947a2aa | Update README | 2023-12-13 22:44:25 -08:00 |
| oobabooga | 7fef16950f | Update README | 2023-12-13 22:42:54 -08:00 |
| oobabooga | d36e7f1762 | Update README | 2023-12-13 22:35:22 -08:00 |
| oobabooga | 9695db0ee4 | Update README | 2023-12-13 22:30:31 -08:00 |
| oobabooga | d354f5009c | Update README | 2023-12-13 22:21:29 -08:00 |
| oobabooga | 0a4fad2d46 | Update README | 2023-12-13 22:20:37 -08:00 |
| oobabooga | fade6abfe9 | Update README | 2023-12-13 22:18:40 -08:00 |
| oobabooga | aafd15109d | Update README | 2023-12-13 22:15:58 -08:00 |
| oobabooga | 634518a412 | Update README | 2023-12-13 22:08:41 -08:00 |
| oobabooga | 0d5ca05ab9 | Update README | 2023-12-13 22:06:04 -08:00 |
| oobabooga | d241de86c4 | Update README | 2023-12-13 22:02:26 -08:00 |
| oobabooga | 36e850fe89 | Update README.md | 2023-12-13 17:55:41 -03:00 |
| oobabooga | 8c8825b777 | Add QuIP# to README | 2023-12-08 08:40:42 -08:00 |
| oobabooga | f7145544f9 | Update README | 2023-12-04 15:44:44 -08:00 |
| oobabooga | be88b072e9 | Update --loader flag description | 2023-12-04 15:41:25 -08:00 |
| Ikko Eltociear Ashimine | 06cc9a85f7 | README: minor typo fix (#4793) | 2023-12-03 22:46:34 -03:00 |
| oobabooga | 000b77a17d | Minor docker changes | 2023-11-29 21:27:23 -08:00 |
| Callum | 88620c6b39 | feature/docker_improvements (#4768) | 2023-11-30 02:20:23 -03:00 |
| oobabooga | ff24648510 | Credit llama-cpp-python in the README | 2023-11-20 12:13:15 -08:00 |
| oobabooga | ef6feedeb2 | Add --nowebui flag for pure API mode (#4651) | 2023-11-18 23:38:39 -03:00 |
| oobabooga | 8f4f4daf8b | Add --admin-key flag for API (#4649) | 2023-11-18 22:33:27 -03:00 |
| oobabooga | d1a58da52f | Update ancient Docker instructions | 2023-11-17 19:52:53 -08:00 |
| oobabooga | e0ca49ed9c | Bump llama-cpp-python to 0.2.18 (2nd attempt) (#4637): update requirements*.txt; add back seed | 2023-11-18 00:31:27 -03:00 |
| oobabooga | 9d6f79db74 | Revert "Bump llama-cpp-python to 0.2.18 (#4611)"; reverts commit 923c8e25fb | 2023-11-17 05:14:25 -08:00 |
| oobabooga | 13dc3b61da | Update README | 2023-11-16 19:57:55 -08:00 |
| oobabooga | 923c8e25fb | Bump llama-cpp-python to 0.2.18 (#4611) | 2023-11-16 22:55:14 -03:00 |
| oobabooga | 322c170566 | Document logits_all | 2023-11-07 14:45:11 -08:00 |
| oobabooga | d59f1ad89a | Update README.md | 2023-11-07 13:05:06 -03:00 |
| oobabooga | ec17a5d2b7 | Make OpenAI API the default API (#4430) | 2023-11-06 02:38:29 -03:00 |
| feng lui | 4766a57352 | transformers: add use_flash_attention_2 option (#4373) | 2023-11-04 13:59:33 -03:00 |
| oobabooga | c0655475ae | Add cache_8bit option | 2023-11-02 11:23:04 -07:00 |
| oobabooga | 77abd9b69b | Add no_flash_attn option | 2023-11-02 11:08:53 -07:00 |
| adrianfiedler | 4bc411332f | Fix broken links (#4367); co-authored by oobabooga | 2023-10-23 14:09:57 -03:00 |
| oobabooga | df90d03e0b | Replace --mul_mat_q with --no_mul_mat_q | 2023-10-22 12:23:03 -07:00 |
| oobabooga | caf6db07ad | Update README.md | 2023-10-22 01:22:17 -03:00 |
| oobabooga | 506d05aede | Organize command-line arguments | 2023-10-21 18:52:59 -07:00 |
| oobabooga | ac6d5d50b7 | Update README.md | 2023-10-21 20:03:43 -03:00 |
| oobabooga | 6efb990b60 | Add a proper documentation (#3885) | 2023-10-21 19:15:54 -03:00 |
| oobabooga | b98fbe0afc | Add download link | 2023-10-20 23:58:05 -07:00 |
| Brian Dashore | 3345da2ea4 | Add flash-attention 2 for windows (#4235) | 2023-10-21 03:46:23 -03:00 |
| mjbogusz | 8f6405d2fa | Python 3.11, 3.9, 3.8 support (#4233); co-authored by oobabooga | 2023-10-20 21:13:33 -03:00 |
| oobabooga | 43be1be598 | Manually install CUDA runtime libraries | 2023-10-12 21:02:44 -07:00 |