Ollama is an open-source, user-friendly platform for running LLMs on your local machine.
https://ollama.com/
Modes: chat mode (interactive terminal session) and server mode (local HTTP API).
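Both modes as a minimal sketch, using the model listed under Models below (ollama run and ollama serve are the standard subcommands):
ollama run huihui_ai/deepseek-r1-abliterated   # chat mode: interactive prompt in the terminal
ollama serve                                   # server mode: starts the local HTTP API (if it is not already running in the background)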
User environment variables:
Note: if the system variable does not take effect, switch to a user variable. Ollama picks up the user environment variable over the system one when loading the model path, so the system-wide setting is ignored unless no user-level value is set.
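A concrete sketch of setting the user-level variable, assuming OLLAMA_MODELS is the variable controlling the model storage path and D:\ollama\models is a placeholder directory:
REM D:\ollama\models is a placeholder; point it at your own models directory
setx OLLAMA_MODELS "D:\ollama\models"
setx without /M writes to the user scope; restart Ollama afterwards so the new value is picked up.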
Port: 11434
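With the server up, the port can be sanity-checked from the command line (a sketch assuming curl is available, using the /api/generate endpoint and the model listed under Models below):
curl http://localhost:11434/api/generate -d "{\"model\": \"huihui_ai/deepseek-r1-abliterated\", \"prompt\": \"Hello\", \"stream\": false}"
A JSON reply with a "response" field confirms the server is listening on 11434.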
Log path: \Users\meesi\AppData\Local\Ollama\Server.log
Models:
ollama run huihui_ai/deepseek-r1-abliterated
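Related commands for managing that model (a sketch; pull fetches or updates the weights, list shows what is installed locally):
ollama pull huihui_ai/deepseek-r1-abliterated
ollama list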
No prior experience needed! Deploy Gemma with one line of code! Fully local! All about speed! No dependencies to set up! 7B size beats 13B performance! Model download included! https://mp.weixin.qq.com/s/4hjewv3TFI5fqe66PaT6Tg
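The article's "one line" deployment presumably follows the same ollama run pattern; a hedged sketch, assuming gemma:7b is the tag in the Ollama model library:
ollama run gemma:7b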