Building UI Without Context Is Slopwork
A practical take on making coding agents context-aware so they can build UI that actually fits your product

I got tired of AI coding agents that couldn't see my UI. They'd generate code without understanding my components, my design system, or my project's patterns.
So I built capy.
An AI UI context engine.
Why I Made It
I kept asking Claude to make a /preview page so I could actually see the UI it was building.
Buttons. Components. SVGs. Everything.
After doing that over and over, I tried building a better version as an MCP tool the AI could call directly.
It turned out to be... okay. Not the best, not what I originally planned. But it does its job, barely, for now.
The Solution
capy runs locally and exposes your codebase context through the Model Context Protocol (MCP).
It gives AI agents a window into your project, allowing them to understand your components, design system, and project structure.
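To make the idea concrete, here is a minimal sketch of the kind of context a tool like this can surface. This is not capy's actual implementation; the function name, regex, and file layout are illustrative. In an MCP server, a function like this would be registered as a tool the agent calls to learn what components exist before writing new UI.

```python
# Illustrative sketch only -- not capy's real API.
# Scans a components directory and maps each .tsx file to the
# React components it exports, the kind of index an agent needs
# before it can build UI that fits the existing design system.
import re
from pathlib import Path

COMPONENT_RE = re.compile(r"export\s+(?:default\s+)?function\s+([A-Z]\w*)")

def list_components(root: str) -> dict[str, list[str]]:
    """Map each .tsx file under `root` to the component names it exports."""
    index: dict[str, list[str]] = {}
    for path in Path(root).rglob("*.tsx"):
        names = COMPONENT_RE.findall(path.read_text())
        if names:
            index[str(path.relative_to(root))] = names
    return index
```

Exposed over MCP, the agent can ask "what components do I have?" instead of guessing, which is the whole point.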
What's Next
I am working on a much better system.
Soon I will release a SKILL.md for it and move to a more agentic CLI workflow.
(if you've spent any time on the internet lately, you know team CLI won. iykyk)
How It Works
- Install capy as an MCP server
- Connect it to Claude Desktop, Claude Code, or any MCP client
- Ask the AI to build or modify UI; it now has full context
- See previews of changes before committing
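Hooking an MCP server into Claude Desktop usually means adding an entry to its config file (claude_desktop_config.json). The command and args below are placeholders, not capy's actual invocation; check the docs at capy.anants.studio for the real one:

```json
{
  "mcpServers": {
    "capy": {
      "command": "npx",
      "args": ["capy-mcp"]
    }
  }
}
```

Restart the client after editing the config so it picks up the new server.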
Get started at capy.anants.studio or star the repo at GitHub.