Enable the chat completions feature
Enable chat completions from your Meilisearch Cloud project in one of two ways:

- Go to your project’s Settings page and enable it under Experimental features
- Or open the Chat tab in your project and activate the feature directly from there
For self-hosted instances, enable the feature through the experimental features API by sending a PATCH request with chatCompletions set to true.

Find your chat API key
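The self-hosted enabling request described above can be sketched with Python's standard library; the URL and master key below are placeholders for your deployment:

```python
import json
from urllib import request

MEILISEARCH_URL = "http://localhost:7700"  # assumption: local self-hosted instance
MASTER_KEY = "MASTER_KEY"                  # placeholder: your instance's master key

# Build PATCH /experimental-features with chatCompletions set to true
req = request.Request(
    f"{MEILISEARCH_URL}/experimental-features",
    data=json.dumps({"chatCompletions": True}).encode(),
    headers={
        "Authorization": f"Bearer {MASTER_KEY}",
        "Content-Type": "application/json",
    },
    method="PATCH",
)
# request.urlopen(req)  # uncomment to send against a running instance
```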
Meilisearch automatically generates a “Default Chat API Key” that combines chatCompletions and search permissions on all indexes. Conversational search requires both actions: chatCompletions authorises the LLM call, and search authorises the retrieval step that feeds documents to the model. Any key you use with the /chats routes must carry both actions, so prefer the default chat API key unless you have a specific reason to create a custom one.
Check if you have the key using:
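A sketch of that lookup using Python's standard library; the URL and master key are placeholders, and the response-handling lines assume a running instance, so they are left commented:

```python
import json
from urllib import request

MEILISEARCH_URL = "http://localhost:7700"  # placeholder: your instance's URL
MASTER_KEY = "MASTER_KEY"                  # placeholder: your master key

# GET /keys lists every API key on the instance
req = request.Request(
    f"{MEILISEARCH_URL}/keys",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
)
# With a running instance, send the request and look for the default chat key:
# with request.urlopen(req) as resp:
#     for key in json.load(resp)["results"]:
#         if "chatCompletions" in key.get("actions", []):
#             print(key["name"], key["key"])
```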
Restrict chat access to specific indexes
Chat queries only search the indexes that the API key can access. The default chat API key is scoped to all indexes. To limit which indexes a chat client can reach, you have two options:

- Create a new API key with both chatCompletions and search actions, scoped to the exact indexes you want exposed. See manage API keys for the full workflow.
- Generate a tenant token from the default chat API key. Tenant tokens inherit both the chatCompletions and search actions from their parent key and let you narrow index access or attach search rules per user.
A tenant token cannot grant access to an index its parent API key does not already cover. Make sure the parent key is scoped to every index the token should be allowed to reach.
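As an illustration of how tenant tokens work under the hood, here is a minimal sketch that hand-builds one with Python's standard library: a JWT signed with the parent API key, whose payload carries the parent key's uid and the search rules. All values below are placeholders; in practice you would normally use an official Meilisearch SDK helper instead.

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """URL-safe base64 without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_tenant_token(parent_api_key: str, api_key_uid: str,
                      search_rules: dict, ttl_seconds: int = 3600) -> str:
    """Sign a tenant token (HS256 JWT) with the parent API key."""
    header = {"alg": "HS256", "typ": "JWT"}
    payload = {
        "apiKeyUid": api_key_uid,     # uid of the parent (default chat) key
        "searchRules": search_rules,  # per-index rules; {} means no extra filter
        "exp": int(time.time()) + ttl_seconds,
    }
    signing_input = (b64url(json.dumps(header).encode()) + "." +
                     b64url(json.dumps(payload).encode()))
    signature = hmac.new(parent_api_key.encode(), signing_input.encode(),
                         hashlib.sha256).digest()
    return signing_input + "." + b64url(signature)

# Hypothetical parent key, uid, and index name:
token = make_tenant_token("PARENT_API_KEY", "uid-of-default-chat-key",
                          {"store-products": {}})
```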
Troubleshooting: Missing default chat API key
If your instance does not have a Default Chat API Key, create one manually.

Configure your indexes
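The manual key creation described in the troubleshooting note above is a POST to /keys; a sketch with placeholder values, granting both required actions on all indexes:

```python
import json
from urllib import request

MEILISEARCH_URL = "http://localhost:7700"  # assumption: local self-hosted instance
MASTER_KEY = "MASTER_KEY"                  # placeholder: your master key

# A key with both actions on all indexes, mirroring the default chat API key
body = {
    "name": "Default Chat API Key",
    "description": "Manually created chat key",
    "actions": ["chatCompletions", "search"],
    "indexes": ["*"],
    "expiresAt": None,
}
req = request.Request(
    f"{MEILISEARCH_URL}/keys",
    data=json.dumps(body).encode(),
    headers={"Authorization": f"Bearer {MASTER_KEY}",
             "Content-Type": "application/json"},
    method="POST",
)
# request.urlopen(req)  # uncomment to send against a running instance
```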
Configure the chat settings for each index you want to make available to the conversational search agent:

- description tells the LLM what the index contains. A good description helps the agent decide which index to search and improves answer relevance. See optimize chat prompts for tips on writing effective descriptions
- documentTemplate is a Liquid template that defines the text representation of each document sent to the LLM. Write it as natural language so the model can extract relevant information easily. Consult the document template best practices article for more guidance
- documentTemplateMaxBytes sets a size limit on the text generated from the template. If the rendered text exceeds this limit, it is truncated. The default of 400 bytes balances context quality and speed
- searchParameters controls how the LLM searches the index (hybrid search, result limits, sorting, etc.). See configure index chat settings for all available options.
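As a sketch of what such a configuration might look like, the example below updates a hypothetical store-products index, assuming the chat settings are sent under a chat field of the index settings route; check configure index chat settings for the exact route and fields on your version:

```python
import json
from urllib import request

MEILISEARCH_URL = "http://localhost:7700"  # placeholder: your instance's URL
API_KEY = "ADMIN_API_KEY"                  # placeholder: a key allowed to change settings

# Hypothetical chat settings for a "store-products" index
chat_settings = {
    "description": "Product catalogue: names, descriptions, and prices of items we sell",
    "documentTemplate": "A product named {{ doc.name }}, described as: {{ doc.description }}",
    "documentTemplateMaxBytes": 400,
    "searchParameters": {"limit": 10},
}
req = request.Request(
    f"{MEILISEARCH_URL}/indexes/store-products/settings",
    data=json.dumps({"chat": chat_settings}).encode(),
    headers={"Authorization": f"Bearer {API_KEY}",
             "Content-Type": "application/json"},
    method="PATCH",
)
# request.urlopen(req)  # uncomment to send against a running instance
```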
Configure a workspace
A workspace holds your LLM provider configuration and system prompt. Each workspace can:

- Connect to a different LLM provider (OpenAI, Azure OpenAI, Mistral, vLLM, or any OpenAI-compatible provider)
- Define its own system prompt and conversation context
- Access a specific set of indexes
The model to use is specified in each /chat/completions call, not in the workspace settings.
On Meilisearch Cloud
Your project comes with a single default workspace named cloud. Use cloud as the WORKSPACE_NAME in all API calls:
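For example, a chat completions request against the cloud workspace might be sketched like this; the project URL, key, model name, and question are all placeholders, and the request body follows the OpenAI chat completions schema:

```python
import json
from urllib import request

PROJECT_URL = "https://your-project-url.meilisearch.io"  # placeholder: your Cloud project
CHAT_API_KEY = "DEFAULT_CHAT_API_KEY"                    # placeholder: your chat API key

body = {
    "model": "gpt-4o-mini",  # placeholder: a model your workspace's provider exposes
    "messages": [{"role": "user", "content": "Which products are on sale?"}],
    "stream": True,          # responses are streamed as server-sent events
}
req = request.Request(
    f"{PROJECT_URL}/chats/cloud/chat/completions",
    data=json.dumps(body).encode(),
    headers={"Authorization": f"Bearer {CHAT_API_KEY}",
             "Content-Type": "application/json"},
    method="POST",
)
# request.urlopen(req)  # uncomment to send against your project
```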
On self-hosted instances
You can create as many workspaces as you need. Choose any name for WORKSPACE_NAME; if the workspace does not exist, Meilisearch creates it automatically:
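A sketch of configuring a self-hosted workspace, assuming an OpenAI-style provider; the workspace name, provider key, and prompt are placeholders (see the workspace settings API reference for the exact field list):

```python
import json
from urllib import request

MEILISEARCH_URL = "http://localhost:7700"  # placeholder: your instance's URL
MASTER_KEY = "MASTER_KEY"                  # placeholder: your master key
WORKSPACE_NAME = "docs-assistant"          # hypothetical; created on first use

settings = {
    "source": "openAi",  # assumption: provider identifier for OpenAI
    "apiKey": "YOUR_PROVIDER_API_KEY",
    "prompts": {
        "system": "You are a helpful assistant answering questions about our documentation.",
    },
}
req = request.Request(
    f"{MEILISEARCH_URL}/chats/{WORKSPACE_NAME}/settings",
    data=json.dumps(settings).encode(),
    headers={"Authorization": f"Bearer {MASTER_KEY}",
             "Content-Type": "application/json"},
    method="PATCH",
)
# request.urlopen(req)  # uncomment to send against a running instance
```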
baseUrl is required for all providers except OpenAI. For OpenAI, it is optional and only needed if you are using a custom endpoint. See the workspace settings API reference for all available fields.
The prompts.system field gives the agent its baseline instructions. For guidance on writing effective prompts, see configure guardrails and optimize chat prompts.
Next steps
Your conversational search setup is complete. Choose how you want to use it:

Build a chat interface
Create a multi-turn conversational interface where users ask follow-up questions.
Generate summarized answers
Display concise AI-generated answers alongside traditional search results.