Perplexity MCP Server
A Node.js implementation that enables Claude to interact with Perplexity AI's language models through Anthropic's Model Context Protocol, providing tools for advanced chat completions and quick queries.
The Perplexity MCP Server is a Node.js implementation of Anthropic's Model Context Protocol (MCP) that enables Claude to interact with Perplexity's language models. This server provides a secure bridge between Claude and Perplexity AI's capabilities, allowing for enhanced AI interactions through tool use.
The server currently implements two main tools:
Advanced chat completion tool with full message history support.
{
  "name": "perplexity_chat",
  "description": "Generate a chat completion using Perplexity AI",
  "parameters": {
    "model": "string (optional) - One of: llama-3.1-sonar-small-128k-online, llama-3.1-sonar-large-128k-online, llama-3.1-sonar-huge-128k-online",
    "messages": "array of {role, content} objects - The conversation history",
    "temperature": "number (optional) - Sampling temperature between 0-2"
  }
}
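For example, Claude could invoke perplexity_chat with arguments along these lines (the model choice and conversation content are illustrative, not part of the tool definition):

{
  "model": "llama-3.1-sonar-small-128k-online",
  "messages": [
    { "role": "system", "content": "You are a concise research assistant." },
    { "role": "user", "content": "Summarize recent developments in battery recycling." }
  ],
  "temperature": 0.7
}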
Simplified single-query interface for quick questions.
{
  "name": "perplexity_ask",
  "description": "Send a simple query to Perplexity AI",
  "parameters": {
    "query": "string - The question or prompt to send",
    "model": "string (optional) - One of: llama-3.1-sonar-small-128k-online, llama-3.1-sonar-large-128k-online, llama-3.1-sonar-huge-128k-online"
  }
}
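A perplexity_ask call only needs a query string, for example (again illustrative):

{
  "query": "What is the Model Context Protocol?",
  "model": "llama-3.1-sonar-large-128k-online"
}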
git clone https://github.com/yourusername/perplexity-mcp-server.git
cd perplexity-mcp-server
npm install
Create a .env file with your Perplexity API key:
PERPLEXITY_API_KEY=your-api-key-here
npm run build
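The PERPLEXITY_API_KEY value is read from the environment at runtime via dotenv; a minimal sketch of what that startup check can look like (the exact error message is illustrative):

import "dotenv/config"; // loads PERPLEXITY_API_KEY from the .env file created above

const apiKey = process.env.PERPLEXITY_API_KEY;
if (!apiKey) {
  // Fail fast with a clear message instead of sending unauthenticated requests later.
  throw new Error("PERPLEXITY_API_KEY is not set; add it to .env or the environment");
}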
To add this server to Claude Desktop, update your claude_desktop_config.json:
{
  "mcpServers": {
    // more servers...
    "perplexity": {
      "command": "node",
      "args": ["/path/to/perplexity-mcp-server/dist/index.js"],
      "env": {
        "PERPLEXITY_API_KEY": "YOUR_PERPLEXITY_API_KEY"
      }
    }
    // more servers...
  }
}
The configuration file is typically located at:
Windows: %APPDATA%/Claude/config/claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/config/claude_desktop_config.json
Linux: ~/.config/Claude/config/claude_desktop_config.json
Start the development server with automatic recompilation:
npm run dev
The server uses TypeScript and implements the MCP protocol using the @modelcontextprotocol/sdk package.
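As a rough orientation (not the project's exact source), wiring a tool-serving MCP server with that SDK typically looks something like this, with the two tool definitions registered in the tools/list handler:

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// Declare the server identity and advertise tool support.
const server = new Server(
  { name: "perplexity-mcp-server", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// Answer tools/list with the perplexity_chat and perplexity_ask definitions.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    /* perplexity_chat and perplexity_ask schemas as documented above */
  ],
}));

// Claude Desktop communicates with the server over stdio.
async function main() {
  const transport = new StdioServerTransport();
  await server.connect(transport);
}

main().catch((error) => {
  console.error("Failed to start server:", error);
  process.exit(1);
});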
PerplexityServer Class
Implements the MCP server protocol
Manages error handling and server lifecycle
Tools System
Uses @modelcontextprotocol/sdk for the MCP implementation
The server implements comprehensive error handling.
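The usual pattern with this SDK is to catch failures inside the tools/call handler and return them as tool errors, so a failed Perplexity request is reported back to Claude instead of crashing the server. A sketch, continuing from the setup above and using a hypothetical callPerplexity helper in place of the real request logic:

import { CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// Hypothetical helper standing in for the server's actual Perplexity API call.
declare function callPerplexity(tool: string, args: unknown): Promise<string>;

server.setRequestHandler(CallToolRequestSchema, async (request) => {
  try {
    const text = await callPerplexity(request.params.name, request.params.arguments);
    return { content: [{ type: "text", text }] };
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error);
    // Surface the failure as a tool error result rather than terminating the process.
    return {
      content: [{ type: "text", text: `Perplexity request failed: ${message}` }],
      isError: true,
    };
  }
});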
Dependencies:
@modelcontextprotocol/sdk: ^1.0.3
dotenv: ^16.4.7
isomorphic-fetch: ^3.0.0
Contributing:
Create your feature branch (git checkout -b feature/amazing-feature)
Commit your changes (git commit -m 'Add some amazing feature')
Push to the branch (git push origin feature/amazing-feature)
This project is licensed under the ISC License.
Common issues and solutions:
Server Not Found
Verify the path in claude_desktop_config.json is correct
Ensure the project has been built (npm run build)
Check if Node.js is in your PATH
Authentication Errors
Verify your Perplexity API key in .env
Check if the API key has the required permissions
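One way to rule out a key problem is to call the Perplexity API directly, outside the MCP server. A small check script, assuming Perplexity's standard chat completions endpoint and one of the models listed earlier (a 200 response means the key works; a 401 points to an invalid or missing key):

import "dotenv/config";
import "isomorphic-fetch"; // provides a global fetch in Node

async function checkApiKey() {
  const response = await fetch("https://api.perplexity.ai/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "llama-3.1-sonar-small-128k-online",
      messages: [{ role: "user", content: "ping" }],
    }),
  });
  console.log(response.status, await response.text());
}

checkApiKey();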
Tool Execution Errors
Compare the arguments being sent against the tool schemas the server advertises, listed below for reference:
[
  {
    "description": "Generate a chat completion using Perplexity AI",
    "inputSchema": {
      "properties": {
        "messages": {
          "description": "Array of messages in the conversation",
          "items": {
            "properties": {
              "content": {
                "type": "string"
              },
              "role": {
                "enum": [
                  "system",
                  "user",
                  "assistant"
                ],
                "type": "string"
              }
            },
            "required": [
              "role",
              "content"
            ],
            "type": "object"
          },
          "type": "array"
        },
        "model": {
          "description": "The model to use for completion",
          "enum": [
            "mixtral-8x7b-instruct",
            "codellama-34b-instruct",
            "sonar-small-chat",
            "sonar-small-online"
          ],
          "type": "string"
        },
        "temperature": {
          "description": "Sampling temperature (0-2)",
          "maximum": 2,
          "minimum": 0,
          "type": "number"
        }
      },
      "required": [
        "messages"
      ],
      "type": "object"
    },
    "name": "perplexity_chat"
  },
  {
    "description": "Send a simple query to Perplexity AI",
    "inputSchema": {
      "properties": {
        "model": {
          "description": "The model to use for completion",
          "enum": [
            "llama-3.1-sonar-small-128k-online",
            "llama-3.1-sonar-large-128k-online",
            "llama-3.1-sonar-huge-128k-online"
          ],
          "type": "string"
        },
        "query": {
          "description": "The question or prompt to send",
          "type": "string"
        }
      },
      "required": [
        "query"
      ],
      "type": "object"
    },
    "name": "perplexity_ask"
  }
]