# MCP Server OpenAI

Query OpenAI models directly from Claude using MCP protocol.
## Setup

Add to `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "openai-server": {
      "command": "python",
      "args": ["-m", "src.mcp_server_openai.server"],
      "env": {
        "PYTHONPATH": "C:/path/to/your/mcp-server-openai",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}
```
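Claude Desktop launches the entry above as `python -m src.mcp_server_openai.server`, so `PYTHONPATH` must point at the cloned repository root and `OPENAI_API_KEY` is passed through the server's process environment. The snippet below is only a quick sanity check you can run yourself; it is not part of this repository:

```python
import importlib.util
import os

# The "env" block above is injected into the server process, and "args" makes
# Claude Desktop run `python -m src.mcp_server_openai.server`.
print("OPENAI_API_KEY set:", bool(os.environ.get("OPENAI_API_KEY")))

try:
    # Resolves only if PYTHONPATH points at the repository root.
    spec = importlib.util.find_spec("src.mcp_server_openai.server")
    print("server module importable:", spec is not None)
except ModuleNotFoundError as exc:
    print("server module not importable:", exc)
```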
## Installation

```bash
git clone https://github.com/pierrebrunelle/mcp-server-openai
cd mcp-server-openai
pip install -e .
```
## Testing

```python
# Run tests from the project root
pytest -v test_openai.py -s

# Sample output:
# Testing OpenAI API call...
# OpenAI Response: Hello! I'm doing well, thank you for asking...
# PASSED
```
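The repository's test file is not reproduced here, but a minimal test exercising the same round trip might look like the sketch below. It assumes the `openai` Python SDK v1+ and an `OPENAI_API_KEY` in the environment; the actual `test_openai.py` may differ:

```python
# Illustrative sketch of an OpenAI round-trip test (not the repository's code).
import os

import pytest
from openai import OpenAI


@pytest.mark.skipif(not os.environ.get("OPENAI_API_KEY"), reason="needs OPENAI_API_KEY")
def test_openai_api_call():
    print("Testing OpenAI API call...")
    client = OpenAI()  # picks up OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello, how are you?"}],
        max_tokens=50,
    )
    answer = response.choices[0].message.content
    print("OpenAI Response:", answer)
    assert answer  # any non-empty reply means the API call succeeded
```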
## License

MIT License
## Tools

The `ask-openai` tool definition exposed to Claude:

```json
[
  {
    "description": "Ask my assistant models a direct question",
    "inputSchema": {
      "properties": {
        "max_tokens": {
          "default": 500,
          "maximum": 4000,
          "minimum": 1,
          "type": "integer"
        },
        "model": {
          "default": "gpt-4",
          "enum": [
            "gpt-4",
            "gpt-3.5-turbo"
          ],
          "type": "string"
        },
        "query": {
          "description": "Ask assistant",
          "type": "string"
        },
        "temperature": {
          "default": 0.7,
          "maximum": 2,
          "minimum": 0,
          "type": "number"
        }
      },
      "required": [
        "query"
      ],
      "type": "object"
    },
    "name": "ask-openai"
  }
]
```
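The fields above map directly onto an OpenAI Chat Completions request. As a reference only, a handler for `ask-openai` could look roughly like the following sketch; the server's real implementation may be structured differently and is not shown here:

```python
# Hedged sketch of an "ask-openai" handler; parameter names and defaults mirror
# the inputSchema above, but this is not the repository's actual code.
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


def ask_openai(query: str,
               model: str = "gpt-4",
               temperature: float = 0.7,
               max_tokens: int = 500) -> str:
    """Forward a query from Claude to the selected OpenAI model."""
    response = client.chat.completions.create(
        model=model,                # "gpt-4" or "gpt-3.5-turbo" per the enum
        messages=[{"role": "user", "content": query}],
        temperature=temperature,    # 0 to 2, default 0.7
        max_tokens=max_tokens,      # 1 to 4000, default 500
    )
    return response.choices[0].message.content or ""
```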