Hello, I am looking to use the ollama container to run LLMs with podman. I have installed the NVIDIA Container Toolkit, but the container only seems to have access to the GPU if I add --security-opt=label=disable to the podman run command. Without it, I get the following error: Failed to initialize NVML: Insufficient Permissions.
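In case the exact invocation matters, here is roughly what I'm doing, simplified to a minimal test. The CDI device flag and the cuda image below are just what I've been using to reproduce the problem; the real workload is the ollama container.

```
# Works, but turns off SELinux label separation for the container:
podman run --rm --device nvidia.com/gpu=all --security-opt=label=disable \
  docker.io/nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# The same command without the security-opt fails with:
#   Failed to initialize NVML: Insufficient Permissions
podman run --rm --device nvidia.com/gpu=all \
  docker.io/nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```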
How can I go about using my GPU with containers without disabling SELinux labeling? Is there a policy I need to install, or an SELinux configuration I need to change?
I'm not familiar with SELinux, so I'd appreciate any guidance. Thank you!