Argo is an agent builder for local large language models. Build agents with local and cloud LLMs, RAG, and MCP tools, share your creations with the community, and download AI agents built by others. Key features:
- Download open-source LLMs from Ollama, Hugging Face, or ModelScope with one click.
- Use local documents for RAG and keep them in sync with directories.
- Supports MCP tools.
- Manage agents, each with its own prompt, model, knowledge base, and MCP tools.
MindPal is a no-code platform for building and deploying AI agents and multi-agent workflows. It enables anyone without technical skills to create powerful AI automation solutions by connecting any AI model with any tool. Build complex workflows where multiple AI agents work together to accomplish tasks, with built-in support for MCP servers and tools. Key features:
- No-code AI agent builder
- Multi-agent workflow orchestration
- Support for any AI model provider
- MCP server integration
- Visual workflow designer
- Built-in tool marketplace
A tiny command-line chat application that brings AI conversations to your terminal. Features include chat history stored in JSONL files, an interactive chat interface, multiple bot configurations compatible with the OpenAI chat completion streaming format, DeepSeek-R1 reasoning content support, and MCP client support with multiple server configurations.
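Storing each conversation turn as one JSON object per line keeps the history append-only and easy to inspect with standard command-line tools. The exact record shape this client uses is not documented here, so the fields below are only a hypothetical illustration of what an OpenAI-compatible chat turn (including a DeepSeek-R1 style `reasoning_content` field) might look like in JSONL form:

```jsonl
{"timestamp": "2025-01-15T09:30:00Z", "role": "user", "content": "Summarize the open MCP pull requests"}
{"timestamp": "2025-01-15T09:30:04Z", "role": "assistant", "content": "There are three open pull requests...", "reasoning_content": "The user wants a short summary, so list each PR briefly."}
```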
Enconvo is an AI agent launcher for macOS that streamlines everyday productivity. With instant access, you can automate your daily tasks effortlessly. Its intelligent AI agent system, powered by 150+ built-in tools and MCP support, learns and adapts to your workflow, delivering seamless automation with a highly versatile assistant for macOS.
VS Code integrates MCP with GitHub Copilot through agent mode, allowing direct interaction with MCP-provided tools within your agentic coding workflow. Configure servers in workspace or user settings, or reuse servers already configured for Claude Desktop, with guided MCP installation and secure handling of keys in input variables to avoid leaking hard-coded secrets (an example configuration follows the feature list below). Key features:
- Support for stdio and server-sent events (SSE) transport
- Per-session selection of tools for each agent session to keep performance optimal
- Easy server debugging with restart commands and output logging
- Tool calls with editable inputs and an always-allow toggle
- Integration with the existing VS Code extension system, so extensions can register MCP servers
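As a rough sketch of the settings-based setup, a workspace-level file such as `.vscode/mcp.json` can declare servers and prompt for secrets through input variables. The server name, package, and input id below are placeholders rather than a real installation; the point is that the `${input:...}` reference keeps the key out of the file:

```json
{
  "inputs": [
    {
      "type": "promptString",
      "id": "api-key",
      "description": "API key for the example server",
      "password": true
    }
  ],
  "servers": {
    "example-server": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@example/mcp-server"],
      "env": {
        "API_KEY": "${input:api-key}"
      }
    }
  }
}
```

VS Code prompts for the value and passes it to the server's environment, so the secret never appears in version-controlled settings.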
Tome is an open-source, cross-platform desktop app designed for working with local LLMs and MCP servers. Tome manages your MCP servers, so there's no fiddling with uv/npm or JSON files: connect it to Ollama, copy/paste some MCP servers, and chat with an MCP-powered model in seconds. Key features:
- MCP servers are managed by Tome, so there is no need to install uv or npm or configure JSON
- Users can quickly add or remove MCP servers via the UI
- Any local model on Ollama with tool support is compatible
A client that connects to any MCP server using Server-Sent Events (SSE) and displays conversations in a chat-like UI. It is a standalone Apify Actor for testing MCP servers over SSE, with support for Authorization headers. Built with plain JavaScript (old-school style) and hosted on Apify, it requires no setup to run (a minimal programmatic connection sketch follows the feature list below). Key features:
- Connects to any MCP server via Server-Sent Events (SSE).
- Works with the Apify MCP Server to interact with one or more Apify Actors.
- Dynamically utilizes tools based on context and user queries (if supported by the server).
- Open source: review, suggest improvements, or modify as needed.
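For comparison with the chat UI, connecting to an SSE endpoint programmatically with the official TypeScript SDK looks roughly like this sketch; the server URL is a placeholder, and the import paths assume the `@modelcontextprotocol/sdk` package:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  // Placeholder endpoint; an Apify-hosted MCP server exposes its own SSE URL.
  const transport = new SSEClientTransport(new URL("https://example.com/sse"));

  const client = new Client(
    { name: "sse-test-client", version: "1.0.0" },
    { capabilities: {} }
  );

  // Open the SSE connection and run the MCP initialization handshake.
  await client.connect(transport);

  // Ask the server which tools it exposes, much like the chat UI does on connect.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```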
Tambo is a platform for building custom chat experiences with integrated, customizable user interface components.
✨ A Sleek and Powerful AI Chat Desktop Application ✨ SeekChat supports MCP tool execution, enabling AI to directly control your computer and perform various tasks. Easily automate file management, data analysis, code development, and more, turning AI into a truly intelligent assistant.
Superinterface is AI infrastructure and a developer platform for building in-app AI assistants, with support for MCP, interactive components, client-side function calling, and more. Key features:
- Use tools from MCP servers in assistants embedded via React components or script tags
- SSE transport support
- Use any AI model from any AI provider (OpenAI, Anthropic, Ollama, and others)
Add MCP capabilities to ChatGPT, Gemini, Grok, Google AI Studio, DeepSeek, OpenRouter, T3 Chat, Mistral, and GitHub Copilot. This extension lets you connect to any MCP server and use its tools, models, and capabilities directly from your browser, with no API key required on these chat platforms.