
Ollama Installation and Configuration - Linux Systems

Summary

This section explains how to install and configure Ollama on Linux, as well as how to update it, install a specific version, view its logs, and uninstall it.

I. Quick Installation

Ollama Download: https://ollama.com/download

Ollama official home page: https://ollama.com


Ollama official GitHub source code repository: https://github.com/ollama/ollama/

The official website provides a one-line command for quick installation.


curl -fsSL https://ollama.com/install.sh | sh


This command will automatically download the latest version of Ollama and complete the installation. The following commands are common for Ollama usage:

ollama serve         # start Ollama
ollama create        # create a model from a Modelfile
ollama show          # show information for a model
ollama run           # run a model
ollama pull          # pull a model from a registry
ollama push          # push a model to a registry
ollama list          # list models
ollama cp            # copy a model
ollama rm            # remove a model
ollama help          # get help about any command
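
To illustrate how these subcommands fit together, a typical workflow might look like the following (llama3 is used here purely as an example model name):

ollama pull llama3      # download the model from the registry
ollama list             # confirm it is available locally
ollama run llama3       # chat with it interactively
ollama rm llama3        # remove it when it is no longer needed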
  • To verify that the installation is complete, enter the following in the terminal:
ollama -h

If the command prints a help message listing the available subcommands, the installation was successful 🎉


  • Enabling and using Ollama

First, start Ollama in a terminal and leave it running in the background:

ollama serve


Ollama's model library lives at https://ollama.com/library. Search for the model you want, then open a new terminal and run it:

ollama run llama3

The download speed depends on your bandwidth, and once it finishes the model is ready to use ✌ Remember that Ctrl + D exits the chat.
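
Besides the interactive CLI, the running ollama serve process also exposes an HTTP API, by default on port 11434. A minimal sketch of calling it with curl, assuming the llama3 model pulled above:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'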


II. Manual installation

Note: If an older version of Ollama was installed previously, remove its old libraries first: sudo rm -rf /usr/lib/ollama. For additional requirements, see the official documentation on the Ollama website. 👉

2.1 Download and unzip the installation package that matches the operating system

curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz

For AMD GPUs, also download the additional ROCm package:

curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz

For ARM64 systems, download:

curl -L https://ollama.com/download/ollama-linux-arm64.tgz -o ollama-linux-arm64.tgz
sudo tar -C /usr -xzf ollama-linux-arm64.tgz
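
If you are unsure which archive matches your machine, checking the CPU architecture first helps (x86_64 corresponds to the amd64 package, aarch64 to the arm64 package):

uname -m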

2.2 Start Ollama and verify

Enter the following command to start Ollama:

ollama serve

Verify that Ollama is running by opening another terminal and entering the following command:

ollama -v

2.3 Add Ollama as a self-starting service (recommended)

First, create a user and group for Ollama:

sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
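
As a quick check, you can confirm that the service account exists and that your user was added to the ollama group (a new group membership usually only takes effect after logging in again):

id ollama
groups $(whoami)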

Then create a service file at /etc/systemd/system/ollama.service:

[Unit]
Description=Ollama Service
After=network-online.target
[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
[Install]
WantedBy=default.target
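
Optionally, Ollama reads its settings from environment variables, so options such as the listen address or the model storage directory can be set with additional Environment lines in the [Service] section. The values below are illustrative examples only:

Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_MODELS=/usr/share/ollama/.ollama/models"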

Finally, reload systemd and enable the service:

sudo systemctl daemon-reload
sudo systemctl enable ollama
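
You can then start the service and check that it is running with the standard systemctl commands:

sudo systemctl start ollama
sudo systemctl status ollama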

III. Updates

Run the installation command from above again to update Ollama:

curl -fsSL https://ollama.com/install.sh | sh

Or re-download the latest Ollama package:

curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
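
Either way, confirm the installed version afterwards:

ollama -v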

IV. Installation of specific versions

Set the OLLAMA_VERSION environment variable in the install command to install a specific version, for example:

curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh

V. Viewing the log

To view the logs of Ollama running as a systemd service:

journalctl -e -u ollama
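
To follow the log live while debugging, add journalctl's -f flag:

journalctl -u ollama -f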

VI. Uninstallation

  • Delete the Ollama service:
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
  • Remove the Ollama binary from your bin directory (/usr/local/bin, /usr/bin, or /bin):
sudo rm $(which ollama)
  • Delete downloaded models and Ollama service users and groups:
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
  • Delete downloaded library files:
sudo rm -rf /usr/local/lib/ollama
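
As an optional final check, which ollama should now print nothing:

which ollama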