
A One-Stop Guide to Running LLMs Locally with Ollama

Artificial Intelligence

As a powerful and easy-to-use toolkit, Ollama not only offers a curated selection of high-quality models, but also lets you bring in your own custom models, giving you maximum flexibility and room for customization.

Why Use the Ollama Open-Source Project?

In today's fast-moving AI landscape, large language models (LLMs) have become the center of attention. Since the launch of ChatGPT, its powerful natural-language understanding and generation capabilities have impressed the world, making it a flagship of AI commercialization.

The field, however, is not limited to well-known commercial models such as OpenAI's GPT, Anthropic's Claude, or Google's Gemini. Open-source LLMs are flourishing as well, offering more diverse, flexible, and economical options. Whether you want to experiment with cutting-edge models that are not yet available as public services, need strict data privacy, or simply want the best performance for your budget, open-source LLMs are an excellent choice.

For deploying and using open-source LLMs, Ollama stands out. As a powerful and easy-to-use toolkit, it not only offers a curated selection of high-quality models, but also lets you bring in your own custom models, giving you maximum flexibility and room for customization.

Whether deployed locally or in the cloud, Ollama's clean design and complete feature set make integrating and running LLMs straightforward. It can take advantage of available compute resources, such as GPU acceleration, to unlock a model's full performance. It also exposes several friendly interfaces, including a Web UI and an HTTP API, so you can switch usage modes to suit the scenario.

More importantly, Ollama is well positioned on privacy and security. Because models run locally, conversation content and user data never need to leave your own machine, which directly addresses growing concerns about data security. In domains involving sensitive information or privacy regulations, this makes Ollama a strong choice for LLM deployment.

Ollama also shines on cost. A large library of open-source models lets you pick one whose size and cost-performance ratio fit your actual needs, and Ollama's lightweight design keeps deployment and operations overhead low, further reducing total cost.

In short, Ollama is giving open-source LLMs wings. With its flexible customization, solid performance, good privacy story, and low cost, it is likely to draw more and more individual and enterprise users to open-source LLMs and help push this frontier of AI toward practical adoption. On the road toward more general AI, open-source LLMs and tools like Ollama will play an increasingly important role.

Distinctive Advantages of the Ollama Open-Source Project

Running Ollama has many advantages. It is a popular open-source project that makes it possible to deploy and run large language models locally. Its notable characteristics are:

(1) Active maintenance and updates

The Ollama project is actively maintained. The development team continually improves and optimizes it, so you can rely on stable support and benefit from the latest improvements and fixes.

(2) A large and active community

Ollama has a large and active community that you can join to exchange experience with other users and developers and get help solving problems. An active community means timely support and a project that keeps moving forward.

(3) Flexible deployment options

Ollama ships a convenient Docker image, so you can run LLMs locally without a fiddly installation process and choose the deployment mode that best fits your needs.

(4) Easy to use

Ollama is simple to use. Its intuitive interface and clear documentation let both beginners and experienced developers get started quickly, easily configuring and managing models for text generation and processing tasks.

(5) A wide range of supported models

Ollama supports many models and keeps adding new ones, so you can pick the one that fits your use case. You can also bring in custom models to meet more specialized needs.
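Custom models are introduced through a Modelfile. The sketch below is illustrative only; the base model, parameter value, and system prompt are placeholders of my choosing:

```
FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a concise assistant for infrastructure questions."
```

Registering and running it then looks like `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant`.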

(6) Strong front-end support

There is an excellent third-party open-source front end, "Ollama Web-UI", which makes using Ollama more convenient and intuitive: you can interact with models and visualize results through it.

(7) Multimodal model support

Beyond its native text-processing capabilities, Ollama also supports multimodal models, which can handle inputs beyond plain text, such as images. This multimodal support opens up a broader range of applications and richer functionality.
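For example, multimodal models such as llava accept images as base64 strings in the request body of the `/api/generate` endpoint. A minimal sketch of building such a payload follows; the model name and the stand-in image bytes are illustrative:

```python
import base64

def build_vision_payload(prompt, image_bytes, model="llava"):
    # Multimodal models take an "images" list of base64-encoded files
    # alongside the usual text prompt; stream=False asks for one JSON reply.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    }

# In practice image_bytes would come from open("photo.png", "rb").read();
# stand-in bytes are used here only to show the payload shape.
payload = build_vision_payload("Describe this picture.", b"\x89PNG stand-in")
```

POSTing this payload to a local server running a multimodal model returns the description in the `response` field.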

In summary, running Ollama brings many advantages: active maintenance and updates, a large and active community, flexible deployment options, an easy-to-use interface, diverse model support, strong front-end tooling, and multimodal capability. Together these make Ollama a powerful, popular tool with a rich feature set and a pleasant developer experience.

Basic Steps for Running Ollama Locally

1. Deploy Ollama

Here we use a Mac as the host and deploy with Docker. Note that Docker Desktop on macOS does not pass the host GPU through to containers, so this container runs on CPU, as the `library=cpu` line in the log below confirms. The commands:

[lugalee@Labs ~ ] % docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

[lugalee@Labs ~ ] % docker ps
CONTAINER ID   IMAGE           COMMAND               CREATED      STATUS          PORTS                                           NAMES
cef2b5f8510c   ollama/ollama   "/bin/ollama serve"   7 days ago   Up 31 seconds   0.0.0.0:11434->11434/tcp, :::11434->11434/tcp   ollama
[lugalee@Labs ~ ] % docker logs -f cef2b5f8510c
Couldn't find '/root/.ollama/id_ed25519'. Generating new private key.
Your new public key is: 

ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJjLw5r9XVTA5HWmFMiVR+ulwVj2jtWLWgcD1yTOZb03

2024/05/27 07:08:17 routes.go:1008: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR: OLLAMA_TMPDIR:]"
time=2024-05-27T07:08:17.576Z level=INFO source=images.go:704 msg="total blobs: 0"
time=2024-05-27T07:08:17.576Z level=INFO source=images.go:711 msg="total unused blobs removed: 0"
time=2024-05-27T07:08:17.576Z level=INFO source=routes.go:1054 msg="Listening on [::]:11434 (version 0.1.38)"
time=2024-05-27T07:08:17.576Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama2516332727/runners
time=2024-05-27T07:08:19.764Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cuda_v11 cpu]"
time=2024-05-27T07:08:19.766Z level=INFO source=types.go:71 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="2.5 GiB" available="1.3 GiB"
[GIN] 2024/05/27 - 08:41:23 | 200 |     359.716μs |       127.0.0.1 | HEAD     "/"
[GIN] 2024/05/27 - 08:41:23 | 404 |     773.017μs |       127.0.0.1 | POST     "/api/show"
time=2024-05-27T08:41:31.891Z level=INFO source=download.go:136 msg="downloading 6a0746a1ec1a in 47 100 MB part(s)"
time=2024-05-27T08:41:38.716Z level=INFO source=images.go:1001 msg="request failed: Get \"https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/6a/6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20240527%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20240527T084133Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=618deb64019fbd08450a5b0dc842810c294fd9f7bafb22f3cb29046d2b1379db\": EOF"
time=2024-05-27T08:41:38.716Z level=INFO source=download.go:178 msg="6a0746a1ec1a part 45 attempt 0 failed: Get \"https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/6a/6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20240527%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20240527T084133Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=618deb64019fbd08450a5b0dc842810c294fd9f7bafb22f3cb29046d2b1379db\": EOF, retrying in 1s"
time=2024-05-27T08:42:04.892Z level=INFO source=download.go:251 msg="6a0746a1ec1a part 33 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2024-05-27T08:42:19.892Z level=INFO source=download.go:251 msg="6a0746a1ec1a part 5 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
...
time=2024-05-27T08:49:39.094Z level=INFO source=download.go:136 msg="downloading 4fa551d4f938 in 1 12 KB part(s)"
time=2024-05-27T08:49:42.828Z level=INFO source=download.go:136 msg="downloading 8ab4849b038c in 1 254 B part(s)"
time=2024-05-27T08:49:47.583Z level=INFO source=download.go:136 msg="downloading 577073ffcc6c in 1 110 B part(s)"
time=2024-05-27T08:49:51.368Z level=INFO source=download.go:136 msg="downloading 3f8eb4da87fa in 1 485 B part(s)"
[GIN] 2024/05/27 - 08:49:59 | 200 |         8m35s |       127.0.0.1 | POST     "/api/pull"
[GIN] 2024/05/27 - 08:49:59 | 200 |    1.434741ms |       127.0.0.1 | POST     "/api/show"
[GIN] 2024/05/27 - 08:49:59 | 200 |     893.744μs |       127.0.0.1 | POST     "/api/show"
...

At this point, Ollama is up and running.

2. Run llama3

Next, we can pull and run llama3:

[lugalee@Labs ~ ] % ollama run llama3
pulling manifest 
pulling 6a0746a1ec1a... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 4.7 GB                         
pulling 4fa551d4f938... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  12 KB                         
pulling 8ab4849b038c... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  254 B                         
pulling 577073ffcc6c... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  110 B                         
pulling 3f8eb4da87fa... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  485 B                         
verifying sha256 digest 
writing manifest 
removing any unused layers 
success

We can now interact with the model:

>>> Why do people at the bottom like to tear each other apart? 為什么底層人就喜歡互撕?
A thought-provoking question!

It's important to note that not everyone at the "bottom" (whatever that means in a specific context) likes
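The same interaction can also be driven programmatically: Ollama serves an HTTP API on port 11434, the port mapped in the `docker run` command above. The sketch below uses only the Python standard library and assumes the server is running and the llama3 model has been pulled:

```python
import json
import urllib.request

OLLAMA_HOST = "http://127.0.0.1:11434"  # default port, as mapped above

def build_payload(prompt, model="llama3"):
    # Request body for Ollama's /api/generate endpoint; stream=False
    # returns the whole completion in a single JSON response.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3", host=OLLAMA_HOST):
    """Send a prompt to a locally running Ollama server and return the text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `generate("Why is the sky blue?")` then returns the model's answer as a plain string.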

3. Run the Ollama Web-UI

Next, we use Docker to run the Ollama Web-UI container alongside our Ollama instance:


[lugalee@Labs ~ ] % docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
Unable to find image 'ghcr.io/ollama-webui/ollama-webui:main' locally
main: Pulling from ollama-webui/ollama-webui
f546e941f15b: Pull complete 
24935aba99a7: Pull complete 
07b3e0dc751a: Pull complete 
7e0115596a7a: Pull complete 
a66610a3b2a1: Pull complete 
c175d9a7d72d: Pull complete 
118c270ec011: Pull complete 
2560efb5a1fe: Pull complete 
0a3a251ca3a9: Pull complete 
e66887d25b67: Pull complete 
e138e3ef45c8: Pull complete 
ebe686cd15b0: Pull complete 
cbb3a246d52b: Pull complete 
133390a53c46: Pull complete 
5691cbf1cdd3: Pull complete 
Digest: sha256:d5a5c1126b5decbfbfcac4f2c3d0595e0bbf7957e3fcabc9ee802d3bc66db6d2
Status: Downloaded newer image for ghcr.io/ollama-webui/ollama-webui:main
8fa5ed05075bbcc118451d79d10ea6754fa8c51c47e27016949807ccc60cac4e
[lugalee@Labs ~ ] % docker ps
CONTAINER ID   IMAGE                                    COMMAND               CREATED         STATUS         PORTS                                           NAMES
8fa5ed05075b   ghcr.io/ollama-webui/ollama-webui:main   "bash start.sh"       5 seconds ago   Up 4 seconds   0.0.0.0:3000->8080/tcp, :::3000->8080/tcp       ollama-webui
cef2b5f8510c   ollama/ollama                            "/bin/ollama serve"   7 days ago      Up 2 minutes   0.0.0.0:11434->11434/tcp, :::11434->11434/tcp   ollama
[lugalee@Labs ~ ] % docker logs -f 8fa5ed05075b
No WEBUI_SECRET_KEY provided
Generating WEBUI_SECRET_KEY
Loading WEBUI_SECRET_KEY from .webui_secret_key
INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:     192.168.215.1:48232 - "GET / HTTP/1.1" 200 OK
INFO:     192.168.215.1:48232 - "GET /manifest.json HTTP/1.1" 200 OK
INFO:     192.168.215.1:48233 - "GET /_app/immutable/entry/start.9d275286.js HTTP/1.1" 200 OK
INFO:     192.168.215.1:48236 - "GET /_app/immutable/chunks/index.afef706d.js HTTP/1.1" 200 OK
INFO:     192.168.215.1:48232 - "GET /_app/immutable/chunks/index.65c87b32.js HTTP/1.1" 200 OK
...
INFO:     192.168.215.1:48234 - "GET /api/v1/ HTTP/1.1" 200 OK
...

This command runs the Ollama Web-UI container, named "ollama-webui", maps container port 8080 to host port 3000, and mounts a volume for persistent file storage. You can then open http://127.0.0.1:3000/ to try it out.
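The two containers, Ollama and the Web-UI, can also be managed together. As a sketch, the `docker run` commands above translate roughly into the following compose file; the service and volume names are my own choices:

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
  ollama-webui:
    image: ghcr.io/ollama-webui/ollama-webui:main
    ports:
      - "3000:8080"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - ollama-webui:/app/backend/data
    restart: always

volumes:
  ollama:
  ollama-webui:
```

A single `docker compose up -d` then brings both services up together.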

注冊完成后,我們在本地創建一個賬戶,然后進入主頁進行模型交互,具體可參考如下:

That's it. Enjoy!

Editor: 趙寧寧 | Source: 架構驛站