MCP Server
Integrate LLM.kiwi with AI-powered IDEs using the Model Context Protocol (MCP).
What is MCP?
The Model Context Protocol (MCP) is a standard for connecting AI assistants to external tools and data sources. LLM.kiwi provides an MCP server that enables IDEs like Cursor, Windsurf, and VS Code to use our API directly.
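On the wire, MCP messages are JSON-RPC 2.0. As a rough sketch (method names follow the public MCP specification; the exact handshake your IDE performs, and the protocol revisions this server accepts, may differ), a client opens a session with an `initialize` request:

```python
import json

# JSON-RPC 2.0 "initialize" request: the first message an MCP client sends.
# The clientInfo values are placeholders; protocolVersion is one MCP spec
# revision date, and servers may negotiate a different one.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialize to the JSON string that would actually be transmitted.
wire_message = json.dumps(initialize_request)
```

IDEs with MCP support handle this handshake for you; it is shown only to make the later tool and resource calls easier to follow.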
MCP Endpoint
https://api.llm.kiwi/mcp

Available Tools
chat_completion
Generate text using LLM.kiwi chat models.
Parameters:
- messages: array
- model?: string
- temperature?: number

image_generation
Generate images from text descriptions.
Parameters:
- prompt: string
- size?: string

audio_transcription
Transcribe audio files to text.
Parameters:
- file_url: string
- language?: string

IDE Configuration
Cursor
Add to your .cursor/mcp.json:
{
  "mcpServers": {
    "llm-kiwi": {
      "url": "https://api.llm.kiwi/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

VS Code (with MCP Extensions)
Add to your VS Code settings:
{
  "mcpServers": {
    "llm-kiwi": {
      "url": "https://api.llm.kiwi/mcp",
      "apiKey": "YOUR_API_KEY"
    }
  }
}

Resources
The MCP server also exposes resources for AI agents to read:
- models://list: Available models and capabilities
- docs://api: Full API documentation as markdown
- usage://current: Current usage statistics
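Agents fetch these with the MCP `resources/read` method (a standard method name from the MCP specification; the resource URI below comes from the list above). A minimal sketch of the request payload:

```python
import json

# JSON-RPC "resources/read" request for the models resource exposed above.
read_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "models://list"},
}

# This JSON string is what a client would send over the MCP transport.
read_body = json.dumps(read_request)
```

Your IDE's MCP client issues requests like this automatically when an agent decides to read a resource.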
Example Usage
Once configured, you can use natural language in your IDE:
# In Cursor or another MCP-enabled IDE chat:
"Use the llm-kiwi chat_completion tool to write a haiku about coding"