Hi, I have built an AI workstation with extensive support from Gemini AI. The workstation is based on Fedora and an all-Intel CPU/GPU hardware platform:
System Details Report
Date generated: 2025-10-05 12:17:28
Hardware Information:
Hardware Model: Micro-Star International Co., Ltd. MS-7E34
Memory: 128.0 GiB
Processor: Intel® Core™ Ultra 7 265K × 20
Graphics: Intel® Graphics (ARL)
Graphics 1: Intel® Arc™ Pro B60 Graphics (BMG G21)
Disk Capacity: 12.1 TB
Software Information:
Firmware Version: 2.A10
OS Name: Fedora Linux 42 (Workstation Edition)
OS Build: (null)
OS Type: 64-bit
GNOME Version: 48
Windowing System: Wayland
Kernel Version: Linux 6.16.10-200.fc42.x86_64
I am trying to run a Quadlet container, but the associated Ollama container keeps crashing, or its unit is not recognised by systemd. The content of /home/ashleysheardown/.config/containers/systemd/ollama.container is as follows:
[Unit]
Description=Ollama Service
After=network-online.target
The output of the systemctl command showing the error is as follows:
xxxxx@xxxxx:~$ systemctl --user status ollama.service webui.service
Unit ollama.service could not be found.
○ webui.service - Open WebUI Service
Loaded: loaded (/home/ashleysheardown/.config/containers/systemd/webui.container; generated)
Drop-In: /usr/lib/systemd/user/service.d
└─10-timeout-abort.conf
Active: inactive (dead)
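For completeness, this is roughly how I have been reloading the generator and looking for errors after editing the file (a sketch; the user-generator path is my assumption and may differ between Podman versions):

```
$ systemctl --user daemon-reload
$ journalctl --user -b --no-pager | grep -i quadlet
$ /usr/lib/systemd/user-generators/podman-user-generator --user --dryrun
```

The dryrun invocation is supposed to print the generated ollama.service, or an error explaining why the .container file was skipped.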
Any help would be appreciated. I have a complete record of the configuration/troubleshooting process if further detail is required.
Also, would you mind posting console output in a preformatted block (the </> button in the toolbar)? As you can see in my posts, this makes it much easier to read as a block with a grey background and fixed width font.
Hi Lars, the system appears to have started correctly, and I can now load and use models in Open WebUI. Thank you so much for your insight and willingness to help. The Device vs AddDevice fix seems to have resolved my problems.

What is your interest in Fedora and AI? What is your expertise? I am an Environmental Scientist with an enduring interest in open-source systems and a desire to automate as much of my work as possible, to get better and faster results while leaving more time for the other things I love. In terms of Linux, I started with Red Hat in 1999/2000 and then tried many other distributions; Ubuntu and Linux Mint dominated for a long time, and now Fedora, which I have used for the last 5–10 years. Anyway, this probably isn’t the best place for general conversation. Thanks again; I am happy to continue elsewhere if that is of interest.
Great, happy to help. I am marking this as the solution, just in case somebody else comes across this thread in the future.
I am pretty sure that was the reason why the generator failed and you didn’t have an ollama.service unit.
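For anyone who finds this thread later, a minimal ollama.container might look like the sketch below. The image, port, volume, and device path are assumptions from my own notes, not the exact file from this thread; the key point is that GPU passthrough in a Quadlet [Container] section uses AddDevice=, since an unknown key like Device= makes the generator reject the whole file:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Container]
Image=docker.io/ollama/ollama:latest
PublishPort=11434:11434
Volume=ollama:/root/.ollama
# GPU passthrough: the Quadlet key is AddDevice=, not Device=
AddDevice=/dev/dri

[Service]
Restart=always

[Install]
WantedBy=default.target
```

After saving, a systemctl --user daemon-reload should make ollama.service appear.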
The main reason I got involved with your issue was the use of Quadlets, which I picked up a while ago for Fedora CoreOS. As for LLMs, I am not a huge fan. I see the potential, especially for language-related tasks (duh, Large Language Models), but I also see a lot of people who now rely on LLMs too much and appear to be losing the ability to reason about things themselves.
True, people usually make an introduction post in the “Water Cooler” section to say hi and tell a bit about themselves, if they want to.