# perplexity
TODO
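
Until proper documentation is written here, the following is a minimal sketch of the quantity this example reports. Perplexity is the exponential of the average negative log-likelihood the model assigns to each token of a test text; lower is better. The function name `compute_perplexity` and the sample log-probabilities below are illustrative assumptions, not this example's actual API.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Perplexity over a tokenized text, given the model's log-probability
// log p(token_i | tokens_0 .. i-1) for each predicted token:
//   PPL = exp( -(1/N) * sum_i log p(token_i | context) )
double compute_perplexity(const std::vector<double> & token_logprobs) {
    double nll = 0.0; // accumulated negative log-likelihood
    for (double lp : token_logprobs) {
        nll -= lp;
    }
    return std::exp(nll / token_logprobs.size());
}

int main() {
    // Hypothetical per-token log-probabilities (natural log) from a model.
    std::vector<double> logprobs = { -2.1, -0.7, -1.3, -3.0 };
    printf("perplexity = %.4f\n", compute_perplexity(logprobs));
    return 0;
}
```

Natural logarithms are assumed throughout; log-probabilities in another base must be converted before averaging and exponentiating.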