AI Fact-Checking Against Live Data
Ground Truth is an MCP server that lets AI agents fact-check their own claims against live data sources — probing API endpoints, counting competitors on npm and PyPI, and verifying hypotheses with SUPPORTED or REFUTED verdicts. Works with Claude Desktop, Cursor, Windsurf, and any MCP-compatible client.
$9/month · MIT Licensed · Cloudflare Workers
Why Ground Truth Exists
This tool was born when an AI agent caught itself giving bad advice. During a research conversation, the agent recommended building products based on claims like "competition is low" and "this API is freely available" — then discovered several claims were wrong when tested against live data. Competitors it said didn't exist already had packages on npm. An API it recommended couldn't even be reached. Ground Truth is the tool that conversation needed.
3 Tools for AI Fact-Checking
Ground Truth provides 3 MCP tools that let AI agents verify claims against live data: endpoint probing, competitor counting on npm/PyPI, and multi-source claim verification with structured verdicts.
1. probe_endpoint: Test API Accessibility
Probe a URL or API endpoint and get back HTTP status, content type, response time, authentication requirements, rate limit headers, and a structural summary of the response. Answers the question: "Is this API actually accessible, or am I recommending something that doesn't work?"
# Example: Check if OpenCorporates API is reachable
→ probe_endpoint("https://api.opencorporates.com/v0.4/companies/search")
Status: Connection refused
Verdict: API unreachable — do not build on this
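Conceptually, the probe boils an HTTP response down to a few signals an agent can act on. A minimal Python sketch of that summarization step; the function name, fields, and status-code rules here are illustrative, not the server's actual schema:

```python
def summarize_probe(status: int, headers: dict, elapsed_ms: float) -> dict:
    """Reduce a raw HTTP response to the signals an agent cares about."""
    return {
        # 5xx (or no connection at all) means you can't build on it
        "reachable": 200 <= status < 500,
        "auth_required": status in (401, 403),
        "content_type": headers.get("content-type", "unknown"),
        "rate_limited": any(h.lower().startswith("x-ratelimit") for h in headers),
        "elapsed_ms": elapsed_ms,
    }

# A 401 JSON response: the API is up, but you'd need credentials.
print(summarize_probe(401, {"content-type": "application/json"}, 120.0))
```

A refused connection never reaches this step, which is why the example above reports "API unreachable" before any status summary.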
2. count_competitors: Search npm & PyPI
Count packages or servers in a space on npm or PyPI. Returns total count, top results with version history and update dates, and activity signals. Answers the question: "I'm about to say 'competition is low' — is that actually true?"
# Example: Check MCP email competition
→ count_competitors("email mcp server", "npm")
Found: twilio-mcp, mcp-send-email, and others
Verdict: ❌ "Very low competition" is wrong
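An npm count like this can come from the registry's public search endpoint (`https://registry.npmjs.org/-/v1/search`), which returns a `total` plus an `objects` array of matching packages. A Python sketch of extracting the competition signal from such a response; the fixture is hand-made in the registry's shape, and the cutoff is illustrative, not Ground Truth's actual rule:

```python
def competition_signal(search_response: dict, low_threshold: int = 3) -> dict:
    """Turn an npm-registry search response into a count and a rough signal."""
    total = search_response["total"]
    top = [obj["package"]["name"] for obj in search_response["objects"]]
    return {
        "total": total,
        "top_packages": top,
        "low_competition": total < low_threshold,  # illustrative cutoff
    }

# Fixture shaped like registry.npmjs.org/-/v1/search output:
fixture = {
    "total": 14,
    "objects": [
        {"package": {"name": "twilio-mcp"}},
        {"package": {"name": "mcp-send-email"}},
    ],
}
# 14 matching packages: a "very low competition" claim would be refuted.
print(competition_signal(fixture))
```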
3. verify_claim: Multi-Source Fact-Checking
Test a factual claim against multiple live checks. Returns pass/fail per test and an overall verdict: SUPPORTED, REFUTED, or PARTIALLY SUPPORTED. Answers the question: "Before I present this conclusion, let me verify it."
# Example verdicts from real usage:
⚠️ "Memory MCP competition is Medium" → Medium but actively growing
❌ "Email/SMS MCP servers: very low competition" → Wrong
✅ "Business verification MCP space is empty" → Confirmed
❌ "OpenCorporates has a free API" → Could not verify
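The three-way verdict is an aggregation over individual pass/fail checks. A Python sketch of the obvious aggregation rule; this is an assumption about how such a verdict could be derived, not necessarily the server's exact logic:

```python
def overall_verdict(checks: dict[str, bool]) -> str:
    """Collapse per-check pass/fail results into one verdict."""
    passed = sum(checks.values())
    if passed == len(checks):
        return "SUPPORTED"
    if passed == 0:
        return "REFUTED"
    return "PARTIALLY SUPPORTED"

print(overall_verdict({"endpoint_reachable": True, "returns_json": True}))  # SUPPORTED
print(overall_verdict({"npm_count_low": False, "pypi_count_low": True}))    # PARTIALLY SUPPORTED
```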
How to Set Up Ground Truth
1. Get Your API Key: Subscribe ($9/mo) to get your gt_live_ API key for unlimited calls.
2. Add to Your MCP Client: Add the server URL to Claude Desktop, Cursor, Windsurf, or any MCP-compatible tool.
3. Start Fact-Checking: Your AI now has 3 tools to verify claims against live data before presenting conclusions.
Claude Desktop Configuration
{
"mcpServers": {
"ground-truth": {
"url": "https://ground-truth-mcp.anish632.workers.dev/mcp"
}
}
}
Add to claude_desktop_config.json · Streamable HTTP transport · No local install needed
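Cursor and Windsurf take an equivalent entry in their own MCP config files. For Cursor this is typically `.cursor/mcp.json` (project-level or in your home directory); the shape below mirrors the Claude Desktop config and assumes your client version supports URL-based (Streamable HTTP) servers:

```json
{
  "mcpServers": {
    "ground-truth": {
      "url": "https://ground-truth-mcp.anish632.workers.dev/mcp"
    }
  }
}
```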
Works With Your AI Tools
- Claude Desktop: Native MCP support
- Cursor: MCP tool integration
- Windsurf: MCP server support
- Any MCP Client: Streamable HTTP
Pricing
One plan. Unlimited API calls. No usage tiers. MIT licensed for self-hosting.
Ground Truth Pro
$9/month
- ✓ Unlimited API calls
- ✓ All 3 tools (probe, count, verify)
- ✓ Cloudflare Workers — global low latency
- ✓ Streamable HTTP transport
- ✓ Works with Claude, Cursor, Windsurf
Self-hosting? The source is MIT licensed on GitHub.
Frequently Asked Questions
What is Ground Truth MCP server?
Ground Truth is an MCP (Model Context Protocol) server that lets AI agents fact-check their own research against live data. It provides 3 tools: probe_endpoint (test if an API is accessible), count_competitors (search npm/PyPI for competing packages), and verify_claim (test factual claims with a SUPPORTED/REFUTED verdict). It works with Claude Desktop, Cursor, Windsurf, and any MCP-compatible client.
What is an MCP server?
An MCP (Model Context Protocol) server is a service that provides tools to AI assistants like Claude, Cursor, and Windsurf. MCP is an open standard created by Anthropic that allows AI agents to call external tools through a standardized interface. Ground Truth is an MCP server that provides fact-checking tools so AI agents can verify claims against live data.
How do I connect Ground Truth to Claude Desktop?
Add Ground Truth to your claude_desktop_config.json file under mcpServers with the key "ground-truth" and the URL https://ground-truth-mcp.anish632.workers.dev/mcp. Claude Desktop will automatically discover the 3 tools and make them available during conversations.
What tools does Ground Truth provide?
Ground Truth provides 3 tools: (1) probe_endpoint — tests a URL or API endpoint and returns HTTP status, content type, response time, auth requirements, and rate limit headers. (2) count_competitors — counts packages on npm or PyPI in a given space, returning total count, top results, and activity signals. (3) verify_claim — tests a factual claim against multiple live checks with a SUPPORTED, REFUTED, or PARTIALLY SUPPORTED verdict.
How much does Ground Truth cost?
Ground Truth costs $9 per month for unlimited API calls. The MCP server is hosted on Cloudflare Workers for low-latency global access. The source code is MIT licensed and available on GitHub for self-hosting.
Does Ground Truth work with Cursor and other AI coding tools?
Yes. Ground Truth uses the Streamable HTTP transport protocol, which is compatible with any MCP client. This includes Claude Desktop, Cursor, Windsurf, Cline, and any other tool that supports the Model Context Protocol.
Can I self-host Ground Truth?
Yes. Ground Truth is MIT licensed and the source code is available on GitHub. Clone the repo, run npm install and npm start for local development, or deploy to your own Cloudflare Workers account with npm run deploy.
Why do AI agents need fact-checking tools?
AI agents frequently make claims about API availability, competition levels, and market conditions based on training data that may be outdated or incorrect. Ground Truth was born when an AI agent caught itself giving bad advice — recommending products based on claims like "competition is low" that turned out to be wrong. Ground Truth closes this gap by letting agents verify claims before presenting them.
Stop Guessing. Start Verifying.
Give your AI agents the ability to fact-check themselves against live data.