---
title: Using Groq API Key in LobeChat
description: >-
  Learn how to obtain GroqCloud API keys and configure Groq in LobeChat for
  optimal performance.
tags:
  - LPU Inference Engine
  - GroqCloud
  - LLAMA3
  - Qwen2
  - API keys
  - Web UI
---

# Using Groq in LobeChat

Groq's [LPU Inference Engine](https://wow.groq.com/news_press/groq-lpu-inference-engine-leads-in-first-independent-llm-benchmark/) has excelled in the latest independent Large Language Model (LLM) benchmark, redefining the standard for AI solutions with its remarkable speed and efficiency. By integrating LobeChat with Groq Cloud, you can easily leverage Groq's technology to accelerate the operation of large language models in LobeChat.

Groq's LPU Inference Engine achieved a sustained speed of 300 tokens per second in internal benchmarks, and according to benchmarks by ArtificialAnalysis.ai, Groq outperformed other providers in both throughput (241 tokens per second) and total time to receive 100 output tokens (0.8 seconds).

This document will guide you through using Groq in LobeChat:

### Obtaining GroqCloud API Keys

First, obtain an API Key from the [GroqCloud Console](https://console.groq.com/).

Create an API Key in the `API Keys` menu of the console.

Safely store the key shown in the pop-up, as it will only be displayed once. If you lose it, you will need to create a new key.

### Configure Groq in LobeChat

You can find the Groq configuration option in `Settings` -> `Language Model`, where you can enter the API Key you just obtained.

Next, select a Groq-supported model in the assistant's model options, and you can experience the powerful performance of Groq in LobeChat.
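As an optional sanity check outside LobeChat, you can exercise your key directly against GroqCloud's OpenAI-compatible REST API. The sketch below only builds the HTTP request; sending it (commented out) requires a valid key and network access. The endpoint path is taken from Groq's public documentation, and the model name `llama3-8b-8192` is an example that may need updating to a model currently listed in your GroqCloud console:

```python
import json
import urllib.request

# GroqCloud's OpenAI-compatible chat-completions endpoint (per Groq's docs).
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"


def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat-completions request for GroqCloud."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GROQ_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # The API Key created in the GroqCloud console goes here.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    req = build_chat_request("gsk_your_key_here", "llama3-8b-8192", "Hello!")
    # Uncomment to actually send the request with a real key:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
    print(req.full_url)
```

If the real request returns an HTTP 401, the key is invalid or was revoked; create a new one in the console and update it in LobeChat's settings.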