---
title: LobeChat Supports Ollama for Local Large Language Model (LLM) Calls
description: LobeChat v0.127.0 supports using Ollama to call local large language models.
tags:
  - Ollama AI
  - LobeChat
  - Local LLMs
  - AI Conversations
  - GPT-4
---
# Support for Ollama Calls to Local Large Language Models 🦙
With the release of LobeChat v0.127.0, we're excited to introduce a fantastic new feature—Ollama AI support! 🤯 Thanks to the robust infrastructure provided by [Ollama AI](https://ollama.ai/) and the [efforts of the community](https://github.com/lobehub/lobe-chat/pull/1265), you can now interact with local LLMs (Large Language Models) within LobeChat! 🤩
We are thrilled to share this feature with all LobeChat users. The integration of Ollama AI is a significant technical step forward for the project and reaffirms our commitment to giving users more efficient and intelligent ways to converse with AI.
## 💡 How to Start a Conversation with Local LLMs?
If private deployment is a challenge for you, we recommend trying the LobeChat Cloud service instead: it offers comprehensive model support, including the powerful conversational capabilities of GPT-4, to help you start your AI conversation journey with ease.

To run LobeChat locally with Ollama support, a single Docker command is all it takes:
```bash
docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 lobehub/lobe-chat
```
Yes, it's that simple! 🤩 You don't need to go through complicated configurations or worry about intricate installation processes. We've prepared everything for you; just one command is all it takes to start deep conversations with local AI.
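If your Ollama server listens on a non-default port, only the `OLLAMA_PROXY_URL` value needs to change. Here is a minimal sketch of parameterizing the same command; the `OLLAMA_PORT` variable is our own illustration for this sketch, not a LobeChat setting, and `11434` is Ollama's default listening port:

```bash
# OLLAMA_PORT is a hypothetical variable used only in this sketch;
# it falls back to Ollama's default port 11434 when unset.
OLLAMA_PORT="${OLLAMA_PORT:-11434}"
PROXY_URL="http://host.docker.internal:${OLLAMA_PORT}/v1"

# Print the launch command so you can inspect it before running it.
echo docker run -d -p 3210:3210 -e "OLLAMA_PROXY_URL=${PROXY_URL}" lobehub/lobe-chat
```

Drop the `echo` once the printed command looks right and you want to actually start the container.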