Generative UI
Generative UI is the pattern where AI agents generate tailored interface widgets in real time, adapting the UI to the user's current intent and context.
Key points
- Google defines generative UI as agents generating tailored widgets in real time so the interface matches the user's specific interaction [src-038].
- The production challenge is avoiding demo-only UI generation: agents need to express UI intent without taking over the frontend's design system, rendering framework, or component ownership [src-038].
- A2UI's answer is portability: the agent describes what UI should appear, while the client renders that intent using web, mobile, or custom components [src-038].
- Example use cases in the post include a health companion that generates widgets for lab results, vaccine expirations, and clinic locations, plus a financial planner that creates sliders, charts, and multi-select controls around a user's goal [src-038].
- Generative UI is distinct from static dashboards because it surfaces the most relevant interaction at the moment of need rather than requiring users to navigate fixed menus [src-038].
- The ecosystem includes A2UI, AG-UI, MCP Apps, Open Generative UI, Vercel's json-renderer, Oracle Agent Spec, and agent frameworks that can speak A2UI through middleware [src-038].
- Stitch adds a design-time counterpart to runtime generative UI: AI generates high-fidelity interface directions and interactive flows from intent, canvas context, voice feedback, and design-system rules [src-040].
- Where A2UI focuses on portable UI intent for clients, Stitch focuses on the creative workflow that turns early product intent into screens and prototypes [src-040].
- The AI Engineer corpus adds product examples around AI canvases, whiteboards, browser agents, intelligent interfaces, dynamic products, generative layout, and agent-facing app surfaces [src-077].
- The practical design lesson is that AI UI is not only a chat replacement. Interfaces increasingly need to expose agent state, generated artifacts, context, tool progress, confidence, and human override points [src-077].
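The portability idea in the points above can be sketched in code: the agent emits a framework-neutral description of a widget tree, and each client maps it onto its own component catalog. This is a minimal illustrative sketch, not the actual A2UI schema; the field names (`widget`, `props`, `children`) and the catalog shape are assumptions for the example.

```typescript
// A minimal UI-intent payload an agent might produce (illustrative, not A2UI's real schema).
interface UIIntent {
  widget: string;                          // abstract widget kind, e.g. "slider"
  props: Record<string, string | number>;  // widget-specific parameters
  children?: UIIntent[];                   // nested intents for composite widgets
}

// Client-side catalog: each host registers renderers for the widget kinds it
// supports, using its own design system (here, plain HTML strings for brevity).
type Renderer = (props: Record<string, string | number>, children: string[]) => string;

const webCatalog: Record<string, Renderer> = {
  card: (p, kids) => `<section class="card"><h2>${p.title}</h2>${kids.join("")}</section>`,
  slider: (p) => `<input type="range" min="${p.min}" max="${p.max}" value="${p.value}">`,
  text: (p) => `<p>${p.content}</p>`,
};

// Walk the intent tree; the catalog, not the agent, decides how each widget looks.
function render(intent: UIIntent, catalog: Record<string, Renderer>): string {
  const renderer = catalog[intent.widget];
  if (!renderer) return `<!-- unsupported widget: ${intent.widget} -->`;
  const kids = (intent.children ?? []).map((c) => render(c, catalog));
  return renderer(intent.props, kids);
}

// Example: a financial-planner agent asks for a savings-goal slider.
const intent: UIIntent = {
  widget: "card",
  props: { title: "Savings goal" },
  children: [
    { widget: "text", props: { content: "Adjust your monthly contribution:" } },
    { widget: "slider", props: { min: 0, max: 2000, value: 500 } },
  ],
};

console.log(render(intent, webCatalog));
```

A mobile or terminal client would register a different catalog for the same intent payload, which is the portability property the post attributes to A2UI: the agent never references a concrete component library.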
Related entities
Related concepts
- A2UI
- Component Catalog Driven UI
- Visual Prototyping with AI
- Agentic AI
- AI Product Experimentation
- Stitch
- Vibe Design
- AI-Native Design Canvas
- Agent-Facing Apps
- AI Engineering Discipline
- Model Context Protocol (MCP)
Source references
- [src-038] Google A2UI Team — "A2UI v0.9: The New Standard for Portable, Framework-Agnostic Generative UI" (2026-04-17)
- [src-040] Rustin Banks — "Design UI using AI with Stitch from Google Labs" (2026-03-18)
- [src-077] AI Engineer channel transcript cluster (678 saved transcripts, 2023-10-20 to 2026-05-15)