Deploying a large model: fixing the "ollama.service: Failed with result 'exit-code'" error
Here is how it started:
Loaded: loaded (/etc/systemd/system/ollama.service; disabled; preset: enabled)
Active: activating (auto-restart) (Result: exit-code) since Tue 2025-05-13 19:31:19 CST; >
Process: 12272 ExecStart=/usr/bin/ollama serve (code=exited, status=1/FAILURE)
Main PID: 12272 (code=exited, status=1/FAILURE)
CPU: 18ms
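(As an aside: the trailing > in the Active: line above just means systemctl cut the line off at the terminal width. If you want the full, untruncated output, the flags below are standard systemctl options and work for any unit, not just ollama.)
sudo systemctl status ollama --no-pager --full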
When I ran sudo systemctl start ollama, I assumed everything was fine, so I went on to run sudo systemctl status ollama and got the strange output shown above.
So I used sudo journalctl -u ollama -f to follow the service log and look at the error messages.
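(If you would rather not sit on a live tail, journalctl can also just dump the recent entries for the unit; these are standard journalctl flags, nothing Ollama-specific.)
sudo journalctl -u ollama -n 50 --no-pager          # last 50 log lines for the unit
sudo journalctl -u ollama --since "10 minutes ago"  # only the recent failure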
The failure turned out to be caused by a file access denial (a permission problem). Following a blog post I found online, I checked Ollama's default model path: /usr/share/ollama/.ollama/models
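A few commands that help confirm this kind of diagnosis. I am assuming here, as the official install script sets things up, that the service runs as the ollama user:
ls -ld /usr/share/ollama /usr/share/ollama/.ollama/models   # who owns the directories and what mode they have
namei -l /usr/share/ollama/.ollama/models                   # permissions of every component along the path
sudo -u ollama ls /usr/share/ollama/.ollama/models          # try the access as the service user; a "Permission denied" here matches the service failure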
And then came the part that really surprised me:
Insufficient permissions?? Huh?? Fine, let's change that:
sudo chmod -R 755 /usr/share/ollama  # owner (the ollama user) gets read/write/execute; group and others get read and traverse only
Great, it worked!
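For completeness, this is how I would verify that a permission fix like this actually took effect: restart the unit, then re-read the status and the last few log lines.
sudo systemctl restart ollama
sudo systemctl status ollama --no-pager
sudo journalctl -u ollama -n 20 --no-pager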