---
title: Using Ollama in LobeChat
description: >-
  Learn how to use Ollama in LobeChat, run LLMs locally, and experience
  cutting-edge AI usage.
tags:
  - Ollama
  - Local LLM
  - Ollama WebUI
  - Web UI
  - API Key
---

# Using Ollama in LobeChat

Ollama is a powerful framework for running large language models (LLMs) locally, supporting various language models including Llama 2, Mistral, and more. Now, LobeChat supports integration with Ollama, meaning you can easily enhance your application by using the language models provided by Ollama in LobeChat.

This document will guide you on how to use Ollama in LobeChat: