I’ve been furiously writing MCP servers for local LLMs recently since I’m heavily invested in building AI tools with Xojo.
mcpfetch is an MCP server that allows an LLM to request the contents of a URL as Markdown.
I use it in conjunction with mcpkagi (https://github.com/gkjpettet/mcpkagi), a Kagi search engine MCP server, to discover URLs my LLM might want to add to its context when answering a question.
It takes no command-line arguments.
This is how I use it in my mcp.json file for LM Studio:
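A minimal sketch (the server name and binary path below are placeholders; point command at wherever your mcpfetch build lives):

```json
{
  "mcpServers": {
    "mcpfetch": {
      "command": "/path/to/mcpfetch",
      "args": []
    }
  }
}
```

Since the server takes no command-line arguments, args stays empty.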
Probably stupid questions: why do you need an MCP server to fetch web content? Why not use a simple CURLSMulti instead? I’m currently working on something similar (AI help for my app). The plan is to fetch the whole manual with CURLSMulti, shove everything into an SQLite database, and let the AI work from there.
If you’re using a local LLM in a client app like LM Studio, the model has no access to anything other than its internal knowledge. If the model decides it would like to see a web page, it needs an external tool to retrieve it.
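To make that concrete: an MCP tool call is just a JSON-RPC message the client sends to the server over stdio. A request to a fetch-style server might look roughly like this (the tool name and argument shape are assumptions for illustration, not mcpfetch’s documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "fetch",
    "arguments": { "url": "https://example.com" }
  }
}
```

The server would reply with the page converted to Markdown, which the client then feeds back into the model’s context.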