| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| oobabooga | 512b311137 | Improve the llama-cpp-python exception messages | 2024-07-11 13:00:29 -07:00 |
| oobabooga | aa653e3b5a | Prevent llama.cpp from being monkey patched more than once (closes #6201) | 2024-07-05 03:34:15 -07:00 |
| oobabooga | a47de06088 | Force only 1 llama-cpp-python version at a time for now | 2024-07-04 19:43:34 -07:00 |
| oobabooga | f243b4ca9c | Make llama-cpp-python not crash immediately | 2024-07-04 19:16:00 -07:00 |
| oobabooga | 51fb766bea | Add back my llama-cpp-python wheels, bump to 0.2.65 (#5964) | 2024-04-30 09:11:31 -03:00 |
| oobabooga | 9b623b8a78 | Bump llama-cpp-python to 0.2.64, use official wheels (#5921) | 2024-04-23 23:17:05 -03:00 |
| oobabooga | 3e3a7c4250 | Bump llama-cpp-python to 0.2.61 & fix the crash | 2024-04-11 14:15:34 -07:00 |
| oobabooga | afb51bd5d6 | Add StreamingLLM for llamacpp & llamacpp_HF (2nd attempt) (#5669) | 2024-03-09 00:25:33 -03:00 |
| oobabooga | 069ed7c6ef | Lint | 2024-02-13 16:05:41 -08:00 |
| oobabooga | 86c320ab5a | llama.cpp: add a progress bar for prompt evaluation | 2024-02-07 21:56:10 -08:00 |