Stop Designing UIs for AI - Let the LLM Decide What You See

DevOps Toolkit Series (Viktor Farcic)
Mar 9, 2026

Why It Matters

Allowing LLMs to dictate UI composition transforms how enterprises deliver data‑rich experiences, reducing development overhead while enabling dynamic, context‑aware visualizations that adapt to unpredictable AI outputs.

Key Takeaways

  • Traditional UIs assume predictable data; AI outputs are unpredictable.
  • Two rendering approaches: server‑sent code (MCP apps) vs client‑side components.
  • Client‑side component libraries let LLM choose visual patterns dynamically.
  • Emerging standards (Google's A2UI, OpenAI's Open‑JSON‑UI, Vercel's json‑render) favor declarative, data‑only responses.
  • Early adopters can build custom libraries now; standards still immature.

Summary

The video argues that conventional user interfaces, built for static data structures, are ill‑suited for the fluid, unpredictable outputs of large language models. Instead of pre‑defining dashboards or markdown layouts, developers should let the LLM dictate how information is presented, shifting UI design from human‑crafted templates to AI‑driven decisions.

Two technical paths are explored. The first, exemplified by MCP (Model Context Protocol) apps, ships full HTML/JS bundles from the server, allowing any agent to invoke a pre‑built mini‑app inside a sandboxed iframe. While flexible, this approach yields a patchwork of inconsistent mini‑apps and raises security concerns. The second keeps rendering logic on the client: the LLM returns structured data with type hints, and a predefined component library on the front end renders diagrams, tables, cards, or charts as needed.
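The client‑side pattern can be sketched in a few lines. All names here (`RenderHint`, `AgentResponse`, the renderer map) are illustrative assumptions, not part of any standard: the point is that the LLM supplies only data plus a type hint, and the client owns the mapping from hint to component.

```typescript
// Hypothetical sketch: the LLM returns structured data plus a type hint,
// and the client maps the hint to a predefined renderer it already owns.
type RenderHint = "table" | "card" | "diagram" | "chart";

interface AgentResponse {
  hint: RenderHint; // the LLM's choice of visual pattern
  data: unknown;    // structured payload; its shape depends on the hint
}

// Each renderer turns structured data into a display string. A real UI
// would return DOM nodes or framework components instead of strings.
const renderers: Record<RenderHint, (data: unknown) => string> = {
  table: (d) => `<table-view rows=${(d as unknown[]).length}>`,
  card: (d) => `<card title="${(d as { title: string }).title}">`,
  diagram: () => "<diagram-view>",
  chart: () => "<chart-view>",
};

function render(response: AgentResponse): string {
  const renderer = renderers[response.hint];
  // Fall back to plain text if the LLM picks an unknown hint.
  return renderer ? renderer(response.data) : JSON.stringify(response.data);
}

// Example: the agent chose a table for a list of Kubernetes pods.
const out = render({
  hint: "table",
  data: [{ name: "pod-a" }, { name: "pod-b" }],
});
```

Because the components live on the client, branding, accessibility, and security stay under the application's control; the model only decides which pattern fits the answer.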

A live demo shows a Kubernetes‑cluster query first rendered as a dense markdown blob, then re‑imagined as an interactive diagram with collapsible sections and cards when the client‑side pattern library is used. The speaker cites emerging efforts such as Google's A2UI, OpenAI's Open‑JSON‑UI, and Vercel's json‑render, all converging on a declarative JSON‑first model where agents describe *what* to display rather than *how* to code it.
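A declarative, JSON‑first response might look like the tree below. This schema is a made‑up illustration, not the actual A2UI, Open‑JSON‑UI, or json‑render wire format; it only shows the shape of "describe what, not how": the agent emits a tree of node types and props, and the client walks it with its own components.

```typescript
// Hypothetical declarative UI tree: the agent describes *what* to show,
// never how to code it. Schema is illustrative only.
interface UINode {
  type: string;                    // e.g. "section", "card", "table"
  props?: Record<string, unknown>; // data for the client's component
  children?: UINode[];
}

// A minimal client walks the tree and maps each node type to a component
// it already owns. Here we just produce an indented text outline.
function describe(node: UINode, depth = 0): string {
  const pad = "  ".repeat(depth);
  const title = node.props?.title ? `: ${node.props?.title}` : "";
  const kids = (node.children ?? []).map((c) => describe(c, depth + 1));
  return [`${pad}${node.type}${title}`, ...kids].join("\n");
}

// Example payload an agent might return for a cluster overview.
const tree: UINode = {
  type: "section",
  props: { title: "Cluster Overview" },
  children: [
    { type: "card", props: { title: "Nodes: 3" } },
    { type: "table", props: { title: "Pods" } },
  ],
};
const rendered = describe(tree);
```

The governance benefit is the same as with type hints: unknown node types can be ignored or rendered as plain text, so a misbehaving agent degrades gracefully instead of injecting arbitrary code.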

The implication for businesses is clear: future applications will increasingly delegate UI composition to LLMs, requiring robust component libraries and standards to maintain brand consistency and security. Early adopters can prototype these patterns now, but widespread adoption hinges on mature, interoperable specifications that balance flexibility with governance.

Original Description

Traditional UIs are built around predictable data structures—someone designs the dashboard, the tables, the charts ahead of time. But AI outputs are inherently unpredictable, changing in structure and format with every interaction. This video explores why conventional interfaces break down when AI enters the picture and examines two fundamentally different approaches to solving the problem. The first, MCP Apps, ships full HTML/JS/CSS rendering code from the server to be displayed in sandboxed iFrames—powerful but inconsistent across tools. The second, AI-driven data rendering, flips the model: the UI maintains a library of predefined visualization components (diagrams, tables, cards, charts), and the LLM decides which ones to use, returning only structured data with type hints rather than code or Markdown.
Through a live demonstration using a Kubernetes cluster, the video shows the stark difference between an AI agent cramming complex architecture information into ASCII tables in a terminal versus the same data rendered as interactive diagrams, styled tables, and information cards in a purpose-built Web UI. The key insight is that we're moving toward a future where LLMs don't just answer questions—they decide how answers should be visualized and how users should interact with them. Industry efforts like Google's A2UI, OpenAI's Open-JSON-UI, and Vercel's json-render all point in the same direction: agents output declarative data, and clients render it with their own component libraries. The future of UI design may not be designed by us at all.
#AIUserInterface #MCPApps #DevOpsAI
Consider joining the channel: https://www.youtube.com/c/devopstoolkit/join
▬▬▬▬▬▬ 🔗 Additional Info 🔗 ▬▬▬▬▬▬
🔗 DevOps AI Toolkit: https://devopstoolkit.ai
▬▬▬▬▬▬ 💰 Sponsorships 💰 ▬▬▬▬▬▬
If you are interested in sponsoring this channel, please visit https://devopstoolkit.live/sponsor for more information. Alternatively, feel free to contact me over Twitter or LinkedIn (see below).
▬▬▬▬▬▬ ⏱ Timecodes ⏱ ▬▬▬▬▬▬
00:00 Web UIs for AI
01:10 Why Traditional UIs Fail AI
06:22 MCP Apps: Server-Side Rendering
09:01 AI-Driven Data Rendering
12:18 Future of AI User Interfaces
