Using MCP Sampling in VS Code (Insiders)
Hey friends! I’m super excited to share some awesome news: VS Code Insiders now supports the entire Model Context Protocol (MCP) spec. That means you get not just tools, prompts, and resources, but also roots and (🥁 drumroll) sampling! 🎉
If you’ve been following along, you know sampling has always been a bit of a “just trust me, it works” kind of thing. But now, we can actually see it in action, play with it, and use it in real projects. This is a big deal!
Real-World Example: Journaling with MCP
In my MCP Fundamentals workshop, we build a journaling app that’s accessible only through MCP. It’s a great playground for these new features. We’ve got tools for creating entries, managing tags, and more. But here’s where it gets really cool: instead of manually adding tags to your entries, the LLM can now generate tags for you automatically using sampling.
Here’s how it works:
- When you create a journal entry, we make a sampling request to the client, asking it to generate recommended tags.
- We use `void` instead of `await` so the user doesn't have to wait for the sampling request to finish. The entry is created instantly, and the tag generation happens in the background.
- The sampling request includes a system prompt and all the context needed to generate the right tags: either an empty array (if no tags are needed), references to existing tags, or recommendations for new ones.
Looking ahead, the elicitation feature (currently in the MCP draft spec) could let you approve or reject the LLM's tag suggestions before they're saved. That's going to be a fantastic enhancement!
The User Experience
Let’s walk through what this looks like:
- You ask the LLM to write a journal entry about a trip to the beach and insert it into the database using the MCP server.
- The entry is created—content, mood, location, weather—all there, but no tags yet.
- Instantly, you get a request: “Hey, the MCP server wants to make a language model call to generate tags. Allow?” (It’d be nice to see the exact prompt, but for now, you just approve it.)
- The sampling happens in the background. You don’t have to wait.
- Later, you can ask, “What tags were applied to this entry?” Boom: beach, nature, relaxation, sunset. Perfect!
You can even peek behind the scenes to see the sampling requests and outputs. It’s all transparent and super handy for debugging or just satisfying your curiosity.
Why This Matters
This is a huge leap forward. Now, you can borrow the user’s LLM to accomplish all sorts of completions—like generating tags, summaries, or anything else you can dream up. And it’s all built right into VS Code Insiders.
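If you're curious what one of these requests actually carries, the params of a `sampling/createMessage` call are small enough to sketch as a plain object. The field names below follow the MCP spec; the prompt text and priority value are just illustrative:

```typescript
// Params for an MCP sampling/createMessage request (field names per
// the MCP spec; prompt text and values here are illustrative).
const samplingParams = {
  messages: [
    {
      role: "user",
      content: {
        type: "text",
        text: "Summarize this journal entry in one sentence: we spent the day at the beach.",
      },
    },
  ],
  systemPrompt: "You write concise one-sentence summaries.",
  // Optional hint about what kind of model the client should pick.
  modelPreferences: { intelligencePriority: 0.5 },
  maxTokens: 100,
};

console.log(JSON.stringify(samplingParams.messages[0].role));
```

Because the client owns the model, the server never sees API keys or picks a provider; it just describes what it wants and lets the user's client (here, VS Code Insiders) do the completion.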
I expect this feature to land in other clients soon, so keep your eyes peeled. In the meantime, go try out sampling, have fun, and if you want to go deep, join me in the MCP Fundamentals workshop!
Happy sampling! 🚀