Using Qwen 2.5 models via Ollama for local LLM inference, text analysis, and AI-powered automation
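As a minimal sketch of what "local LLM inference" via Ollama can look like: the snippet below queries a locally pulled Qwen 2.5 model through Ollama's REST API. The endpoint `http://localhost:11434/api/generate` is Ollama's default; the model tag `qwen2.5` is an assumption and should match whatever tag you pulled (e.g. via `ollama pull qwen2.5`). The network call is kept in its own function so nothing runs without a live server.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_payload(prompt: str, model: str = "qwen2.5") -> dict:
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "qwen2.5") -> str:
    """Send the prompt to the local Ollama server and return the generated text.

    Requires `ollama serve` to be running and the model to be pulled beforehand.
    """
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # A non-streaming response carries the full completion under "response".
        return json.loads(resp.read())["response"]
```

Usage would be a single call such as `generate("Summarize this log file: ...")`, making it straightforward to script text analysis or automation tasks against the local model.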
Select the development tools to install