diff --git a/content/en/posts/llama-cpp/ollama-with-deepseek-r1/environment-variables.avif b/content/en/posts/llama-cpp/ollama-with-deepseek-r1/environment-variables.avif
new file mode 100644
index 0000000..820bee9
Binary files /dev/null and b/content/en/posts/llama-cpp/ollama-with-deepseek-r1/environment-variables.avif differ
diff --git a/content/en/posts/llama-cpp/ollama-with-deepseek-r1/index.md b/content/en/posts/llama-cpp/ollama-with-deepseek-r1/index.md
index e22ae39..3df7c16 100644
--- a/content/en/posts/llama-cpp/ollama-with-deepseek-r1/index.md
+++ b/content/en/posts/llama-cpp/ollama-with-deepseek-r1/index.md
@@ -162,6 +162,11 @@ This can be seen in the official [chart](https://raw.githubusercontent.com/deeps
 
 Ollama provides a convenient interface and tools for using and managing models, with the backend being llama.cpp. It supports both CPU and GPU inference optimization.
 
+{{< gh-repo-card-container >}}
+    {{< gh-repo-card repo="ollama/ollama" >}}
+    {{< gh-repo-card repo="ggerganov/llama.cpp" >}}
+{{< /gh-repo-card-container >}}
+
 ## Installation of Ollama
 
 Follow the instructions on [Download Ollama](https://ollama.com/download) to complete the installation. My environment is as follows:
@@ -247,7 +252,13 @@ Flash Attention must be enabled. I recommend setting `OLLAMA_KV_CACHE_TYPE` to `
 
 ### Windows 11
 
-To set environment variables on Windows 11, go to "Advanced System Settings," then choose "Environment Variables." After that, select "New" to add a new variable. Restart Ollama for changes to take effect.
+To set environment variables on Windows 11, go to "Advanced System Settings," then choose "Environment Variables."
+
+{{< image src="system-properties.avif" width="320px" caption="System Properties" >}}
+
+After that, select "New" to add a new variable. Restart Ollama for changes to take effect.
+
+{{< image src="environment-variables.avif" width="480px" caption="Environment Variables" >}}
 
 ### MacOS
 
diff --git a/content/en/posts/llama-cpp/ollama-with-deepseek-r1/system-properties.avif b/content/en/posts/llama-cpp/ollama-with-deepseek-r1/system-properties.avif
new file mode 100644
index 0000000..8ac6893
Binary files /dev/null and b/content/en/posts/llama-cpp/ollama-with-deepseek-r1/system-properties.avif differ
diff --git a/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/environment-variables.avif b/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/environment-variables.avif
new file mode 100644
index 0000000..820bee9
Binary files /dev/null and b/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/environment-variables.avif differ
diff --git a/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/index.md b/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/index.md
index 34261e8..7e10b3c 100644
--- a/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/index.md
+++ b/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/index.md
@@ -160,7 +160,7 @@ repost:
 }
 {{< /echarts >}}
 
-Ollama提供了更方便使用和管理模型的接口和工具,后端是llama.cpp。基于CPU推理优化的工具,也支持GPU。
+Ollama提供了更方便使用和管理模型的接口和工具,它的后端是llama.cpp。一个基于CPU推理优化的工具,也支持GPU。
 {{< gh-repo-card-container >}}
     {{< gh-repo-card repo="ollama/ollama" >}}
     {{< gh-repo-card repo="ggerganov/llama.cpp" >}}
@@ -252,7 +252,13 @@ Flash Attention是必开的,KV Cache我建议选`q8_0`,实测发现`q4_0`会
 
 ### Windows 11
 
-要在Windows 11中设置环境变量,需要进入“高级系统设置”,然后选择“环境变量”,之后选择“新建”。重启Ollama使其生效。
+要在Windows 11中设置环境变量,需要进入“高级系统设置”,
+
+{{< image src="system-properties.avif" width="320px" caption="System Properties" >}}
+
+然后选择“环境变量”,之后选择“新建”。重启Ollama使其生效。
+
+{{< image src="environment-variables.avif" width="480px" caption="Environment Variables" >}}
 
 ### MacOS
 
diff --git a/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/system-properties.avif b/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/system-properties.avif
new file mode 100644
index 0000000..8ac6893
Binary files /dev/null and b/content/zh-cn/posts/llama-cpp/ollama-with-deepseek-r1/system-properties.avif differ
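For readers who prefer a terminal over the System Properties dialog shown in the two new screenshots, the same settings can be persisted with Windows' built-in `setx`. This is only a sketch: `OLLAMA_KV_CACHE_TYPE=q8_0` is the recommendation quoted in the hunk context above, while the `OLLAMA_FLASH_ATTENTION` name is an assumption and not part of this diff.

```python
import subprocess

# Sketch: persist user-level environment variables with Windows' setx instead
# of the System Properties dialog. Ollama reads them only after a restart.
env_vars = {
    "OLLAMA_FLASH_ATTENTION": "1",   # assumed variable name for enabling Flash Attention
    "OLLAMA_KV_CACHE_TYPE": "q8_0",  # q8_0 is the value recommended in the post
}

for name, value in env_vars.items():
    # setx writes to the current user's environment; already-running processes
    # do not see the change, so restart Ollama afterwards.
    subprocess.run(["setx", name, value], check=True)
```

Running this once from any shell has the same effect as adding the variables through the dialog.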