Hi everyone,
I am trying to set up an AI assistant using the Alpaca software, but installation fails for all models.
All help is appreciated.
I have no specific knowledge of this app, but in general, if you see an error message that is truncated, try to get hold of the whole message.
Can you click on the error to see the whole message?
Can you make the window wider to see the end of the message?
I tried everything, but no changes so far.
Have you tried starting Alpaca from the terminal? Applications will often print the full error message there.
$ flatpak run com.jeffser.Alpaca
$ flatpak run com.jeffser.Alpaca
INFO [main.py | main] Alpaca version: 3.5.0
INFO [connection_handler.py | start] Starting Alpaca's Ollama instance...
INFO [connection_handler.py | start] Started Alpaca's Ollama instance
INFO [connection_handler.py | start] client version is 0.5.4
libEGL warning: wayland-egl: could not open /dev/dri/renderD128 (No such file or directory)
libEGL warning: wayland-egl: could not open /dev/dri/renderD128 (No such file or directory)
INFO [model_widget.py | pull_model] Pulling model: llama3.2-vision:11b
INFO [window.py | show_toast] Error pulling 'llama3.2-vision:11b': pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3.2-vision/manifests/11b": dial tcp: lookup registry.ollama.ai: Temporary failure in name resolution
INFO [window.py | closing_app] Closing app...
INFO [connection_handler.py | stop] Stopping Alpaca's Ollama instance
INFO [connection_handler.py | stop] Stopped Alpaca's Ollama instance
The error indicates that Alpaca can't connect to registry.ollama.ai. More specifically, it can't resolve the domain name.
What does the following command return for you? If it returns an error, we know the problem is not with the application but with your networking.
$ ping registry.ollama.ai
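If you prefer to test name resolution directly rather than pinging, something like this should also work on most Linux systems:
$ getent hosts registry.ollama.ai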
$ ping registry.ollama.ai
PING registry.ollama.ai (104.21.75.227) 56(84) bytes of data.
64 bytes from 104.21.75.227: icmp_seq=1 ttl=54 time=20.8 ms
64 bytes from 104.21.75.227: icmp_seq=2 ttl=54 time=22.3 ms
64 bytes from 104.21.75.227: icmp_seq=3 ttl=54 time=20.9 ms
... (similar replies continue through icmp_seq=143) ...
^C
--- registry.ollama.ai ping statistics ---
143 packets transmitted, 143 received, 0% packet loss, time 142176ms
rtt min/avg/max/mdev = 19.857/25.525/45.025/4.554 ms
I should have included -c 5 so it doesn't run infinitely, my bad.
But your networking is fine, so it seems like something strange is happening with the Alpaca flatpak. Do you use other flatpaks, and if so, does networking work fine there?
I don’t really know where to go from here, but at least we have some useful information now.
I agree, there's something strange happening with the Alpaca flatpak. There might be a way of solving it; I will try it now, and if it works, I will post it here as the solution.
Sure, I hope you can get it working. There might also be other users in this discussion who have an idea of what is happening.
And just to double-check: you haven't turned off networking for the Alpaca flatpak using Flatseal or a command, right?
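If you want to check, the flatpak's permissions can be listed from the terminal; network access should show up on the shared= line:
$ flatpak info --show-permissions com.jeffser.Alpaca
And if it is missing, it can be granted again with an override:
$ flatpak override --user --share=network com.jeffser.Alpaca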
The flatpak in question had networking turned off by default for some reason; other flatpaks don't share this quirk. It's working now and everything seems to be going as it should.
My idea is to feed the AI model documents and then question it for details. Any thoughts on that are appreciated.
That's strange, it shouldn't do that. But great that you got it working!
That seems like a good use case for LLMs. I have only played around with the smaller models; I don't have a good GPU in my laptop, so any larger model is very slow. If you have an AMD GPU, don't forget to install the com.jeffser.Alpaca.Plugins.AMD flatpak extension so it utilizes the GPU.
I am thinking of testing Llama 3.2 Vision and then going from there depending on the results.
Thank you, that's great input.
I tried installing it, but this was the response:
$ com.jeffser.Alpaca.Plugins.AMD f
bash: com.jeffser.Alpaca.Plugins.AMD: command not found...
That's the package name; the command would be flatpak install com.jeffser.Alpaca.Plugins.AMD.
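That is, from the terminal:
$ flatpak install com.jeffser.Alpaca.Plugins.AMD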
It's also possible to install it using GNOME Software; it should show up as an option on the Alpaca page.
Oh, my mistake. Thank you.