Unsure if running native or container? #82
Replies: 2 comments
-
On macOS native we don't use containers because containers don't exist on macOS; we should document this somewhere. What you are running above doesn't use podman machine. Yes, it's using python3 on the host system, which is fine, that's by design 😄 But we can still pull models as OCI artefacts on macOS, so one can use container transport mechanisms. We will also have GPU acceleration on macOS soon, using @slp's mesa patches and podman-machine. We also plan on adding a --no-container option on Linux if someone wants to run outside a container. I'd recommend most people run within a container, keeping the base system clean and using a tested runtime environment. The Containerfile isn't so complex now, but once we work on the auto-detect GPU stuff it will get more complex.
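For anyone who wants to confirm which mode they are actually in, here is a minimal sketch (not part of ramalama itself) that checks the marker files container engines create inside containers: podman writes /run/.containerenv and Docker writes /.dockerenv.

```python
# Minimal sketch, not ramalama code: infer whether the current process
# is inside a container by checking well-known engine marker files.
# Podman creates /run/.containerenv and Docker creates /.dockerenv
# inside containers; neither exists on a plain host.
import os

def running_in_container() -> bool:
    return os.path.exists("/run/.containerenv") or os.path.exists("/.dockerenv")

if __name__ == "__main__":
    print("container" if running_in_container() else "native host")
```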
-
Yes, the goal on macOS is to use containers if podman and podman machine are present (or Docker and Docker Machine); if not, it will run natively.
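A rough sketch of that fallback decision, assuming podman as the engine; this is illustrative only, not ramalama's actual implementation, and the `--format "{{.Running}}"` template field is an assumption about podman's machine-list output.

```python
# Illustrative sketch only: prefer a container engine when it is
# usable, otherwise fall back to running natively on the host.
import platform
import shutil
import subprocess

def podman_machine_running() -> bool:
    """Best-effort check for a running podman machine VM.

    Assumes `podman machine list --format "{{.Running}}"` prints one
    boolean per configured machine (a guess about podman's template fields).
    """
    try:
        out = subprocess.run(
            ["podman", "machine", "list", "--format", "{{.Running}}"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return False
    return "true" in out.lower()

def should_use_container() -> bool:
    if shutil.which("podman") is None:
        return False                      # no engine installed: run natively
    if platform.system() == "Darwin":
        return podman_machine_running()   # macOS needs the podman machine VM
    return True                           # Linux: containers by default

if __name__ == "__main__":
    print("would run via:", "container" if should_use_container() else "native")
```

The same kind of check could be repeated for Docker before falling back to native.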
-
I purposely have podman machine not running, yet ramalama still works?
I'm confused by the docs: is this using native python on the host system, or is it still being run through a container?