Cool concept. MCP access for AI agents fits OpenChaos perfectly. The tools are well-structured and the error handling is thorough.
This would work better as a standalone service. Here's the thinking:
I'm standing up a VPS backend for OpenChaos (see #152 for context). Your in-memory cache would actually work on a persistent process, but on Vercel serverless it gets a near-zero hit rate because each request can land on a fresh cold start. The VPS gives you real caching and a long-running process, which is exactly what an MCP server needs.
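To make the cold-start point concrete, here's a minimal sketch of the kind of module-level TTL cache in play (illustrative names, not your actual implementation):

```typescript
// Minimal TTL cache sketch. On a long-running VPS process this Map
// survives between requests, so repeated GitHub API lookups hit the
// cache. On serverless, each cold start begins with an empty Map,
// so the hit rate collapses.
type Entry<T> = { value: T; expires: number };

class TtlCache<T> {
  private store = new Map<string, Entry<T>>();
  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const e = this.store.get(key);
    if (!e) return undefined;
    if (Date.now() > e.expires) {
      this.store.delete(key); // lazily evict stale entries
      return undefined;
    }
    return e.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
  }
}
```

Nothing fancy is required; the point is only that the process holding the `Map` has to stay alive for the cache to pay off.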
A few technical notes for the potential restructure:
- **Unused SDK.** `@modelcontextprotocol/sdk` is in `package.json` but never imported in any source file; the route handler manually implements JSON-RPC dispatching. On the VPS you could use the SDK's built-in `StreamableHTTPServerTransport` for proper Streamable HTTP support.
- **`getOpenPRs()` changes.** The added merge-status and commit-status calls add 2 extra API requests per PR. With 20 open PRs that's 40 additional calls per page load. Worth splitting into a separate enriched function so the frontend isn't affected.
- **Dead code.** `predict-next-merge.ts` is built but never exported or registered. Either wire it up or cut it.
- **Duplicated code.** `getHeaders()` and `GITHUB_REPO` are defined in multiple tool files but already exist in `github.ts`. Worth deduplicating.
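On the `getOpenPRs()` point, the split could look roughly like this (hypothetical names; the fetchers are stubbed here and would be GitHub API calls in the real code):

```typescript
// Sketch of the proposed split: keep getOpenPRs() cheap for the
// frontend, and put the per-PR enrichment behind a separate function
// that only the MCP tools call.
interface PR { number: number; title: string }
interface EnrichedPR extends PR { mergeable: boolean; ciState: string }

// Cheap single list request — all the frontend needs. (Stubbed.)
async function getOpenPRs(): Promise<PR[]> {
  return [{ number: 1, title: "demo" }];
}

// Stand-in for the two extra per-PR requests (merge + commit status).
async function fetchStatus(n: number): Promise<{ mergeable: boolean; ciState: string }> {
  return { mergeable: true, ciState: "success" };
}

// Enriched variant: runs the per-PR calls concurrently, so 20 PRs cost
// one round-trip of latency — and cost nothing at all on normal page loads.
async function getEnrichedPRs(): Promise<EnrichedPR[]> {
  const prs = await getOpenPRs();
  return Promise.all(
    prs.map(async (pr) => ({ ...pr, ...(await fetchStatus(pr.number)) }))
  );
}
```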
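For the duplication note, the fix is just one shared module that everything imports from — a sketch of what that could look like (hypothetical shape; your actual `github.ts` may already be close to this):

```typescript
// lib/github.ts — single source of truth for the repo slug and request
// headers (hypothetical shape; the real github.ts may differ).
export const GITHUB_REPO = "skridlevsky/openchaos"; // assumed slug

export function getHeaders(token?: string): Record<string, string> {
  const headers: Record<string, string> = {
    Accept: "application/vnd.github+json",
    "X-GitHub-Api-Version": "2022-11-28",
  };
  if (token) headers.Authorization = `Bearer ${token}`; // auth is optional
  return headers;
}

// Tool files then do:
//   import { getHeaders, GITHUB_REPO } from "../lib/github";
// instead of redefining their own copies.
```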
Path forward: Would you be interested in PR-ing the MCP tools into https://github.com/skridlevsky/openchaos-backend?
The tools themselves are solid and can move over mostly as-is. I'm also considering adding llms.txt to the main site so AI agents can discover the MCP endpoint.
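If you do add llms.txt, a minimal sketch following the llms.txt proposal might be (URL and summary wording are placeholders, not the real values):

```markdown
# OpenChaos

> Placeholder summary line describing the project (assumed wording).

## API

- [MCP endpoint](https://example.com/api/mcp): Streamable HTTP MCP server exposing the repo tools
```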