Mistral
@Mistral
9 models
Mistral provides advanced general-purpose, specialized, and research models that are widely used for complex reasoning, multilingual tasks, and code generation. Through its function calling interface, users can integrate custom functionality for specific applications.
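
To illustrate what such a custom function declaration can look like, here is a minimal sketch of a chat completion request against Mistral's `v1/chat/completions` endpoint using an OpenAI-style `tools` array. The `get_weather` function, its parameters, and the `MISTRAL_API_KEY` environment variable are hypothetical examples for illustration only and are not part of LobeChat's configuration.

```ts
// Sketch: declare a custom function ("tool") in a Mistral chat completion request.
// Assumes the v1/chat/completions endpoint and an OpenAI-style tools schema.
const apiKey = process.env.MISTRAL_API_KEY;

const response = await fetch("https://api.mistral.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: "mistral-small-latest",
    messages: [{ role: "user", content: "What is the weather in Paris today?" }],
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather", // hypothetical custom function
          description: "Look up the current weather for a city",
          parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      },
    ],
    tool_choice: "auto",
  }),
});

const data = await response.json();
// If the model decides to call the function, the call and its arguments appear here.
console.log(data.choices[0].message.tool_calls);
```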

Supported Models

| Model | Maximum Context Length | Maximum Output Length | Input Price | Output Price |
| --- | --- | --- | --- | --- |
| -- | 128K | -- | -- | -- |
| -- | 128K | -- | -- | -- |
| -- | 128K | -- | -- | -- |
| -- | 32K | -- | -- | -- |
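
For reference, you can list the model IDs available to your account by querying Mistral's `v1/models` endpoint directly. Below is a minimal sketch, assuming your key is exported as `MISTRAL_API_KEY`; it is illustrative and not part of the LobeChat setup itself.

```ts
// Sketch: list the model IDs the account behind MISTRAL_API_KEY can access.
const res = await fetch("https://api.mistral.ai/v1/models", {
  headers: { Authorization: `Bearer ${process.env.MISTRAL_API_KEY}` },
});
const { data } = await res.json();
console.log(data.map((m: { id: string }) => m.id));
```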

Using Mistral AI in LobeChat

The Mistral AI API is now available for everyone to use. This document will guide you on how to use Mistral AI in LobeChat:

Step 1: Obtain Mistral AI API Key

  • Register or sign in to Mistral AI's La Plateforme (console.mistral.ai)
  • Open the API Keys page and create a new API key
Obtain your API Key
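
Before configuring LobeChat, you can optionally confirm that the key is active by sending a minimal request to Mistral's `v1/chat/completions` endpoint. The sketch below assumes the key is exported as `MISTRAL_API_KEY` and that the model ID `mistral-small-latest` is available to your account; newly created keys may return an authorization error for a few minutes.

```ts
// Sketch: send a one-message chat completion to confirm the key
// (and any required subscription) is active.
const res = await fetch("https://api.mistral.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.MISTRAL_API_KEY}`,
  },
  body: JSON.stringify({
    model: "mistral-small-latest",
    messages: [{ role: "user", content: "ping" }],
  }),
});
console.log(res.ok ? "API key works" : `Request failed: HTTP ${res.status}`);
```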

Step 2: Configure Mistral AI in LobeChat

  • Go to the Settings interface in LobeChat
  • Find the setting for Mistral AI under Language Model
Enter API Key

If you are using mistral.ai, your account must have a valid subscription for the API key to work properly. Newly created API keys may take 2-3 minutes to become active. If the "Test" button fails, please retry after 2-3 minutes.

  • Enter the obtained API key
  • Choose a Mistral AI model for your AI assistant to start the conversation
Select Mistral AI Model and Start Conversation

During usage, you may incur charges from the API service provider; please refer to Mistral AI's pricing policies.

You can now engage in conversations using the models provided by Mistral AI in LobeChat.
