# mcp-client-langchain-py
This client facilitates the invocation of AI models from providers like Anthropic, OpenAI, and Groq, enabling users to manage and configure large language model interactions seamlessly.
This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function `convert_mcp_to_langchain_tools()` from `langchain_mcp_tools`. This function handles parallel initialization of multiple specified MCP servers and converts their available tools into a list of LangChain-compatible tools (`List[BaseTool]`).

LLMs from Anthropic, OpenAI, and Groq are currently supported.

A TypeScript version of this MCP client is available here.
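The flow described above can be sketched roughly as follows. This is an illustrative sketch, not the repository's actual code: the server names, the model name, and the exact return shape of `convert_mcp_to_langchain_tools()` (tools plus a cleanup coroutine) are assumptions based on the description above.

```python
# Hypothetical sketch: wire MCP servers into a LangChain ReAct agent.
# Names and signatures below are assumptions, not verbatim from this repo.
import asyncio

# MCP server definitions in the same shape as llm_mcp_config.json5's
# "mcp_servers" section (snake_case key, illustrative entries).
mcp_servers = {
    "fetch": {
        "command": "uvx",
        "args": ["mcp-server-fetch"],
    },
    "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
}


async def main() -> None:
    # Imported lazily so the sketch reads without the packages installed.
    from langchain_anthropic import ChatAnthropic
    from langgraph.prebuilt import create_react_agent
    from langchain_mcp_tools import convert_mcp_to_langchain_tools

    # Initializes all listed servers in parallel and returns
    # LangChain-compatible tools plus a cleanup coroutine.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        llm = ChatAnthropic(model="claude-3-5-haiku-latest")
        agent = create_react_agent(llm, tools)
        result = await agent.ainvoke(
            {"messages": [("user", "Read README.md and summarize it.")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()

# To run: asyncio.run(main())
```

The cleanup coroutine matters because each MCP server runs as a child process; closing the sessions explicitly avoids leaking those processes.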
Prerequisites:

- `uv` (`uvx`) installed to run Python package-based MCP servers
- `npx` installed to run Node.js package-based MCP servers

Install dependencies:
```bash
make install
```
Setup API keys:

```bash
cp .env.template .env
```

Update `.env` as needed. `.gitignore` is configured to ignore `.env` to prevent accidental commits of the credentials.

Configure LLM and MCP server settings in `llm_mcp_config.json5` as needed.
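For illustration, a minimal `llm_mcp_config.json5` might look like the following; the key names and values here are placeholders based on the notes in this README, not taken verbatim from the shipped file:

```json5
{
  // Which LLM to use; provider and model values are illustrative.
  llm: {
    model_provider: "anthropic",
    model: "claude-3-5-haiku-latest",
  },

  // Note the snake_case key "mcp_servers" (not "mcpServers").
  mcp_servers: {
    fetch: {
      command: "uvx",
      args: ["mcp-server-fetch"],
    },
    // "${...}" is replaced with the corresponding environment
    // variable, e.g. an API key kept in .env:
    // some_server: { command: "npx", args: [], env: { API_KEY: "${API_KEY}" } },
  },
}
```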
Note:

- The key `mcpServers` has been changed to `mcp_servers` to follow the snake_case convention commonly used in JSON configuration files.
- `${...}` notations are replaced with the values of the corresponding environment variables; you can put API keys in the `.env` file and refer to them with the `${...}` notation as needed.

Run the app:
```bash
make start
```
It takes a while on the first run.
Run in verbose mode:

```bash
make start-v
```
See command-line options:

```bash
make start-h
```
At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.
Example queries can be configured in `llm_mcp_config.json5`.
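As an illustration, such queries might be listed in the config file roughly like this; the `example_queries` key name is a guess, so check the shipped `llm_mcp_config.json5` for the exact schema:

```json5
{
  // Hypothetical key name: queries offered when you press Enter
  // at the prompt without typing anything.
  example_queries: [
    "Read the file README.md and summarize it.",
    "Fetch the top story from Hacker News.",
  ],
}
```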