The MCP Search Engine: The Future of AI Tool Discovery
The Model Context Protocol (MCP) is rapidly transforming how we interact with AI, but the next leap forward is just around the corner. Imagine a world where your AI assistant (let's call it Jarvis) can instantly discover and orchestrate the perfect set of tools for any task, without drowning in a sea of irrelevant options. This is the future of user interaction that people are building right now.
Let's break down how this will work, why it matters, and what it means for the future of intelligent user experiences.
The Problem: Tool Overload
As discussed in Solving Tool Overload: A Vision for Smarter AI Assistants, the more tools you add to your AI's toolbox, the harder it becomes for the AI to pick the right one. Too many tools can actually make the AI less effective, not more. We need a way for the AI to find the right tool at the right time—just like a human assistant would.
The Solution: Dynamic Tool Discovery
Here's the vision: instead of loading every possible tool into the AI's context up front, the client (Jarvis) queries a dedicated MCP Search Engine (currently yet to be built) to find only the most relevant tools for the user's request. This keeps the AI focused, efficient, and scalable—even as the ecosystem of MCP tools explodes.
Let's walk through how this works, step by step:
Step 1: The User Makes a Request
It all starts with the user. You ask Jarvis to do something—maybe reserve a dinner table, pay back a friend, get workout advice, summarize a document, or automate a workflow. You don't need to know which tools exist or how they work. You just describe your goal in natural language.
Step 2: Jarvis Queries the MCP Search Engine
Instead of guessing which tools to use, Jarvis sends your request to the MCP Search Engine. This is a registry or index of all available MCP tools, designed to match your query with the most relevant capabilities. Think of it as Google, but for AI tools instead of websites.
Crucially, the MCP Search Engine doesn't just match tools to your request. It can also take into account your personal preferences, past usage, and context—drawing on what it already knows about you—to ensure the results are tailored just for you.
This is yet to be built. I expect this to be a highly competitive space with a variety of monetization models. It's possible that whoever supplies the Jarvis client will also supply the MCP Search Engine (similar to Google Chrome + Google, or Brave Browser + Brave Search).
For more on automatic discovery and server verification, see the ongoing discussion at modelcontextprotocol/discussions/84.
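To make step 2 concrete, here's a minimal sketch of what Jarvis's query to a search engine might look like. Nothing here is part of the MCP spec today: `searchTools`, the request fields, and the naive keyword-matching ranking are all assumptions standing in for a real (yet-to-be-built) service.

```typescript
// Hypothetical shape of a tool-search request; field names are illustrative.
interface ToolSearchRequest {
  query: string; // the user's natural-language request
  userContext?: {
    recentTools?: string[]; // personalization signals the engine may use
  };
  limit?: number; // cap on how many tools to return
}

interface ToolDescriptor {
  name: string;
  description: string;
  serverUrl: string; // where the MCP server hosting the tool lives
}

// Stand-in for the search engine: naive keyword match over an in-memory
// registry, with a small boost for tools the user has used recently.
function searchTools(registry: ToolDescriptor[], req: ToolSearchRequest): ToolDescriptor[] {
  const terms = req.query.toLowerCase().split(/\s+/);
  const scored = registry.map((tool) => {
    const text = `${tool.name} ${tool.description}`.toLowerCase();
    let score = terms.filter((t) => text.includes(t)).length;
    if (req.userContext?.recentTools?.includes(tool.name)) score += 1;
    return { tool, score };
  });
  return scored
    .filter((s) => s.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, req.limit ?? 5)
    .map((s) => s.tool);
}

// Example registry and query.
const registry: ToolDescriptor[] = [
  { name: "reserve_table", description: "Reserve a dinner table at a restaurant", serverUrl: "https://example.com/dining" },
  { name: "send_payment", description: "Pay back a friend", serverUrl: "https://example.com/payments" },
];
const results = searchTools(registry, { query: "book a dinner table", limit: 3 });
```

A real engine would replace the keyword match with semantic search (more on that below), but the interface, query in, ranked tool descriptors out, would look much the same.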
Step 3: The MCP Search Engine Responds
The MCP Search Engine analyzes your request and returns a curated list of the tools best suited to help. This list is dynamic and context-aware, taking into account your preferences, past usage, and security requirements.
This means the tools Jarvis receives are not just relevant—they're also personalized to your needs and habits.
Step 4: Jarvis Sends the Query and Tools to the LLM
Armed with your request and the right set of tools, Jarvis now hands everything off to the LLM (Large Language Model). The LLM can see exactly which tools are available for this task, and can orchestrate them intelligently to deliver the best result.
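Here's a sketch of how a client might package the user's query and the discovered tools into an LLM request at step 4. The payload mirrors the common chat-completions "function calling" shape, but the exact field names vary by provider and are illustrative here:

```typescript
// Illustrative types; real providers differ in naming and structure.
interface ToolDef { name: string; description: string; parameters: object }
interface LlmRequest {
  model: string;
  messages: { role: "system" | "user"; content: string }[];
  tools: { type: "function"; function: ToolDef }[];
}

// Combine the user's query with only the tools the search engine returned,
// so the LLM's context stays small and focused.
function buildLlmRequest(userQuery: string, tools: ToolDef[]): LlmRequest {
  return {
    model: "some-model",
    messages: [
      { role: "system", content: "Use the provided tools to fulfill the user's request." },
      { role: "user", content: userQuery },
    ],
    tools: tools.map((t) => ({ type: "function", function: t })),
  };
}

const req = buildLlmRequest("Pay Alex back $20", [
  { name: "send_payment", description: "Send money to a contact", parameters: {} },
]);
```

The key point is what's absent: instead of every tool in the ecosystem, only the curated handful from step 3 reaches the model.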
Step 5: The LLM Responds with a Solution
The LLM uses the selected tools to solve your problem, then returns the answer to Jarvis. This could be anything from a simple response to a complex, multi-step workflow.
Step 6: Jarvis Delivers the Result
Finally, Jarvis presents the solution to you. The whole process is seamless, efficient, and tailored to your needs—no tool overload, no wasted effort.
Loop Back to Step 1
This loop continues until Jarvis is satisfied that the user's request has been fulfilled. The tools and resources in the LLM's context can be kept up to date between requests, and a smart Jarvis client uses the limited context window as efficiently as possible to drive the LLM toward the best action.
From there, the user can make another request if they like and the process repeats.
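The whole loop (steps 1 through 6, repeated until done) can be sketched in a few lines. The `SearchEngine`, `Llm`, and `ToolRunner` interfaces below are hypothetical stand-ins, not real APIs:

```typescript
// The LLM either asks for a tool call or produces a final answer.
type LlmStep =
  | { kind: "toolCall"; toolName: string; args: unknown }
  | { kind: "final"; answer: string };

interface SearchEngine { find(query: string): string[] }
interface Llm { step(query: string, tools: string[], observations: string[]): LlmStep }
interface ToolRunner { run(name: string, args: unknown): string }

function handleRequest(query: string, search: SearchEngine, llm: Llm, runner: ToolRunner, maxSteps = 10): string {
  const tools = search.find(query); // steps 2-3: discover relevant tools
  const observations: string[] = [];
  for (let i = 0; i < maxSteps; i++) {
    const step = llm.step(query, tools, observations); // steps 4-5
    if (step.kind === "final") return step.answer;     // step 6: deliver result
    observations.push(runner.run(step.toolName, step.args)); // loop back
  }
  return "Gave up after too many steps.";
}

// Demo wiring with stubs: the "LLM" calls one tool, then answers.
const demoSearch: SearchEngine = { find: () => ["reserve_table"] };
const demoRunner: ToolRunner = { run: (name) => `${name}: confirmed` };
const demoLlm: Llm = {
  step: (query, tools, observations) =>
    observations.length === 0
      ? { kind: "toolCall", toolName: tools[0], args: { time: "7pm" } }
      : { kind: "final", answer: observations[0] },
};
const answer = handleRequest("reserve a table for two", demoSearch, demoLlm, demoRunner);
// answer === "reserve_table: confirmed"
```

The `maxSteps` cap matters in practice: an agent loop without a bound can burn tokens indefinitely on a task it can't complete.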
Parallels with a Human Assistant
This model closely mirrors how you might work with a highly competent human assistant. Imagine you hire someone to help you with a variety of tasks. You don't expect them to know how to do everything off the top of their head. Instead, when you give them a request, they might:
- Break down your problem into smaller steps
- Figure out what resources or tools they need for each step
- Go out and find anything they're missing, whether by searching online, asking colleagues, or acquiring new skills
- Come back to you for clarification if they get stuck, or seek out additional resources
- Assemble just the right set of tools and information to get the job done efficiently
A smart AI assistant, powered by MCP and a search engine, can do the same: plan, orchestrate, and adapt dynamically, always focused on your goal.
Tool Search: From Simple to Sophisticated
How does the search for tools work? For small, focused workflows, it might be enough to just provide all available tools to the LLM and let it solve the problem. But as the ecosystem grows—imagine thousands or millions of tools—more advanced approaches are needed:
- Vector search: Find tools based on semantic similarity to the user's request
- Graph-based search: Map relationships between tools, workflows, and past successes
- Feedback loops: Use data from previous successful operations to improve future tool selection and ranking
The search engine can start simple and grow in sophistication as the ecosystem and user needs evolve. The goal is always the same: get the right tools, at the right time, for the right job.
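To make the vector-search idea concrete, here's a toy ranking over tool descriptions. A real system would use a learned embedding model; here, bag-of-words counts over a fixed vocabulary stand in for embeddings, and everything else (the tools, the vocabulary) is invented for illustration:

```typescript
// Toy "embedding": count occurrences of each vocabulary word in the text.
function embed(text: string, vocab: string[]): number[] {
  const words = text.toLowerCase().split(/\W+/);
  return vocab.map((v) => words.filter((w) => w === v).length);
}

// Cosine similarity between two vectors (0 when either has zero length).
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  const d = norm(a) * norm(b);
  return d === 0 ? 0 : dot / d;
}

// Rank tools by semantic similarity between the query and each description.
function rankBySimilarity(query: string, tools: { name: string; description: string }[], vocab: string[]) {
  const q = embed(query, vocab);
  return tools
    .map((t) => ({ name: t.name, score: cosine(q, embed(t.description, vocab)) }))
    .sort((a, b) => b.score - a.score);
}

const vocab = ["reserve", "dinner", "table", "payment", "friend", "pay"];
const tools = [
  { name: "reserve_table", description: "Reserve a dinner table" },
  { name: "send_payment", description: "Pay a friend back" },
];
const ranked = rankBySimilarity("I want to pay my friend", tools, vocab);
```

Swap the toy `embed` for a real embedding model and an approximate-nearest-neighbor index, and this is essentially how a first-generation tool search engine could work, with graph signals and feedback loops layered on later.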
Why This Matters
This approach solves the scalability problem at the heart of MCP. As the number of available tools grows, dynamic discovery ensures that only the right tools are ever in play. It also opens the door to:
- Personalized experiences: The search engine can learn your preferences and recommend tools you love.
- Security and trust: Search engines can offer "vetted tools" as a security feature, appealing to users and reducing the risk of "tool poisoning" (see more).
- Faster innovation: Developers can publish new tools to the MCP registry (WIP), instantly making them discoverable by any client. A search engine could even add a server to its index automatically when a user mentions it in conversation.
- A more open ecosystem: Competing registries and natural ranking systems (based on real user outcomes) keep the ecosystem healthy and user-focused.
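One way a client could enforce the "vetted tools" idea above: treat the search engine's allowlist as a hard filter and drop any listing whose server isn't on it. The types and allowlist format here are assumptions, not an established mechanism:

```typescript
// Sketch: reduce "tool poisoning" risk by only accepting tool listings
// whose server hostname appears on a vetted allowlist.
interface Listing { toolName: string; serverUrl: string }

function vetListings(listings: Listing[], vettedHosts: Set<string>): Listing[] {
  return listings.filter((l) => vettedHosts.has(new URL(l.serverUrl).hostname));
}

const vettedHosts = new Set(["tools.example.com"]);
const listings: Listing[] = [
  { toolName: "reserve_table", serverUrl: "https://tools.example.com/dining" },
  { toolName: "sketchy_tool", serverUrl: "https://unknown.example.net/x" },
];
const safe = vetListings(listings, vettedHosts);
```

Hostname allowlisting alone isn't a complete defense (a vetted server can still serve a malicious tool description), but it illustrates how trust decisions could live in the client rather than the model.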
The Road Ahead
As discussed in User-friendly just-in-time Auth with MCP and Please Make Jarvis, the future of AI is about more than just connecting tools—it's about orchestrating them in a way that feels natural, secure, and empowering for users.
Dynamic tool discovery is the missing piece that will make Jarvis-like assistants a reality. It's how we'll move beyond "LLM wrappers" (see more) and into a world where AI can truly do anything: finding and using the right tools, at the right time, for the right user.
If you're excited about building this future, join the EpicAI newsletter and help shape the next era of intelligent user experiences.