AI-ML
Ollama v0.17.0
SUMMARY
OpenClaw can now be installed and configured automatically via Ollama, making it the easiest way to get up and running with OpenClaw using open models like Kimi-K2.5, GLM-5, and Minimax-M2.5. Get started: `ollama launch openclaw`
Detailed Description
## OpenClaw

<img width="2368" height="1830" alt="oc1" src="" />

OpenClaw can now be installed and configured automatically via Ollama, making it the easiest way to get up and running with OpenClaw using open models like Kimi-K2.5, GLM-5, and Minimax-M2.5.

Get started: `ollama launch openclaw`

## Web search in OpenClaw

When using cloud models, web search is enabled, allowing OpenClaw to search the internet.

## What's Changed

* Improved tokenizer performance
* Ollama's macOS and Windows apps will now default to a context length based on available VRAM

## New Contributors

* @natl-set made their first contribution in

Full Changelog:
Ollama v0.17.0 adds automatic OpenClaw installation and improves tokenizer performance.
- Automatic installation and configuration of OpenClaw through Ollama.
- Improved tokenizer performance.
- Ollama's macOS and Windows apps now adjust context length based on available VRAM.
Who should care
Everyone using Ollama and OpenClaw.
AI-generated · may contain errors
Related Releases
AI-ML
Ollama v0.23.0
## Claude Desktop with Ollama Launch <img width="1272" height="872" alt="ca1" src="https://github.com/user-attachments/assets/1d550e3f-0272-4429-8cb2-06d32344cb77" /> Claude Desktop is now supported with Ollama Launch. Both Claude Cowork and Claude Code are supported within the Claude Desktop App
AI-ML
Ollama v0.22.1
## What's Changed * Updated the **Gemma 4** renderer for thinking and tool calling improvements * Model recommendations are now updated without updating Ollama * Aligned the desktop app's launch page with `ollama launch` integrations * Fixed the Poolside integration title in `ollama launch`