llama.swiftui
Local inference of llama.cpp on an iPhone. Models can be loaded from local storage through a file picker. So far I have only tested the StarCoder 1B model, but the app can most likely handle 7B models as well.
https://github.com/bachittle/llama.cpp/assets/39804642/e290827a-4edb-4093-9642-2a5e399ec545
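To illustrate the model-loading flow, here is a minimal SwiftUI sketch of a file picker that selects a model file on device and hands its path to a loader. The view name and the `loadModel(at:)` call are illustrative placeholders, not the actual API of the llama.swiftui example.

```swift
import SwiftUI
import UniformTypeIdentifiers

// Minimal sketch: pick a local model file (e.g. a .gguf) and pass its path
// to a hypothetical loader. Not the example's actual code.
struct ModelPickerView: View {
    @State private var showPicker = false
    @State private var modelPath = ""

    var body: some View {
        VStack(spacing: 12) {
            Text(modelPath.isEmpty ? "No model loaded" : modelPath)
            Button("Load model…") { showPicker = true }
        }
        .fileImporter(isPresented: $showPicker,
                      allowedContentTypes: [.data], // GGUF has no registered UTType
                      allowsMultipleSelection: false) { result in
            if case .success(let urls) = result, let url = urls.first {
                // Files picked from outside the sandbox need security-scoped access.
                if url.startAccessingSecurityScopedResource() {
                    defer { url.stopAccessingSecurityScopedResource() }
                    modelPath = url.path
                    // loadModel(at: url) // placeholder for the example's loader
                }
            }
        }
    }
}
```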