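# include the shared requirements used by the legacy llama convert scripts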
-r ../../requirements/requirements-convert_legacy_llama.txt
--extra-index-url https://download.pytorch.org/whl/cpu
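# torch and torchvision below can resolve from the CPU-only PyTorch wheel index added above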
pillow~=10.2.0
torch~=2.2.1
torchvision~=0.17.1