
Chat

Composable AI assistant surface. Three layout variants (drawer, popup, inline) share the same content slots and a status-driven Send/Stop button toggle. Provider-agnostic — wire it to CopilotKit, Vercel AI SDK, or your own backend by binding the Send/Stop events.

LlmChat
WCAG AA · 0 deps

Demo


Interactive demo coming soon. See the code example below.

```jsx
<LlmChat variant="drawer" open={open} onOpenChange={setOpen} status="idle">
  <LlmChatHeader>
    <LlmAvatar size="sm" name="AI" />
    <span>AI Assistant</span>
    <LlmBadge variant="success" size="sm">Online</LlmBadge>
  </LlmChatHeader>
  <LlmChatMessages>
    <LlmChatMessage role="assistant">Hi! How can I help today?</LlmChatMessage>
    <LlmChatMessage role="user">Show me a sorting function.</LlmChatMessage>
    <LlmChatMessage role="assistant">
      <LlmCodeBlock language="ts" code={snippet} />
    </LlmChatMessage>
  </LlmChatMessages>
  <LlmChatInput onSend={onSend} onStop={onStop} />
</LlmChat>
```

API

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| `variant` | `'drawer' \| 'popup' \| 'inline'` | `'drawer'` | Layout: slide-in drawer, floating bubble + popup window, or embedded inline card. |
| `status` | `'idle' \| 'streaming' \| 'error'` | `'idle'` | Connection/response status. Drives the input footer button (Send → Stop) and disables the textarea while streaming. |
| `open` | `boolean` | `false` | Whether the chat is open. Only used by the drawer and popup variants. Two-way bindable. |
| `onOpenChange` | `(open: boolean) => void` | — | React callback fired when the open state should change. Vue uses `v-model:open`; Angular uses `[(open)]`. |

Composition parts

Chat is composed from the parts below. Each part is its own component — drop the ones you don't need.

LlmChatHeader

Title block (avatar / name / status badge) plus the close button. Slot-only. The close button is auto-hidden on the inline variant.

LlmChatMessages

Scrollable message list. Project LlmChatMessage, LlmChatTyping, and LlmChatSuggestion children inside.

LlmChatMessage

A single message bubble. `role` drives alignment (user → right, primary fill; assistant/system → left, surface-sunken). Use `failed` for the error-state dashed border.

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| `role` | `'user' \| 'assistant' \| 'system'` | `'assistant'` | Sender of the message; drives bubble color and alignment. |
| `failed` | `boolean` | `false` | Marks the message as failed (red dashed border). |

LlmChatTyping

Three animated dots. Render at the end of the message list while the assistant is streaming. Respects `prefers-reduced-motion`.

LlmChatSuggestion

Tappable starter chip used in the empty state. Emits `selected` (Angular/Vue) or `onSelected` (React) with the label.

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| `label` | `string` | — (required) | Primary text of the chip. |
| `hint` | `string` | — | Optional secondary text shown below the label. |

LlmChatInput

Composable input footer. Wraps a textarea and renders a primary "Send" button — automatically swapped for a danger "Stop" button while the parent chat's status is "streaming".

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| `placeholder` | `string` | status-aware | Defaults to "Message your AI assistant…", "Waiting for response…" while streaming, or "Try again…" on error. |
| `value` | `string` | — | Optional controlled value. Omit for uncontrolled internal state. |
| `onSend` | `(text: string) => void` | — | Fires on Enter (without Shift) or Send-button click. |
| `onStop` | `() => void` | — | Fires when the Stop button is clicked while streaming. |
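The status-aware placeholder default can be modeled as a small pure function; `resolvePlaceholder` is a hypothetical helper sketching the behavior described above, not an exported API:

```typescript
type ChatStatus = "idle" | "streaming" | "error";

// Hypothetical helper mirroring LlmChatInput's default placeholder logic:
// an explicit `placeholder` prop wins; otherwise the text follows `status`.
function resolvePlaceholder(status: ChatStatus, placeholder?: string): string {
  if (placeholder !== undefined) return placeholder;
  switch (status) {
    case "streaming":
      return "Waiting for response…";
    case "error":
      return "Try again…";
    default:
      return "Message your AI assistant…";
  }
}
```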

Accessibility

ARIA role: `dialog` (drawer/popup), `region` (inline)

| Key | Action |
| --- | --- |
| `Escape` | Close the drawer or popup variant. The inline variant ignores Escape. |
| `Tab` / `Shift+Tab` | Cycle focus inside the drawer (focus is trapped via CDK A11y / focus-trap equivalents). |
| `Enter` (in input) | Send the message. |
| `Shift+Enter` (in input) | Insert a newline without sending. |
- The drawer uses a native `<dialog>` with `aria-modal`, the same accessibility model as LlmDialog and LlmDrawer.
- Streaming state is announced via `aria-live="polite"` on the typing indicator so screen readers know the assistant is responding.
- The Stop button uses LlmButton `variant="danger"` so the destructive intent is communicated by both color and label.
- The inline variant has no overlay chrome; the close button is hidden because there is nothing to close.
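The Enter vs. Shift+Enter behavior above can be sketched as a keydown predicate; `shouldSend` and `KeyInfo` are illustrative names, and the `streaming` flag reflects the rule that the textarea is disabled while a response is in flight:

```typescript
// Minimal sketch of the input's Enter handling. A real handler would call
// preventDefault() and dispatch onSend; names here are illustrative only.
interface KeyInfo {
  key: string;
  shiftKey: boolean;
}

function shouldSend(e: KeyInfo, streaming: boolean): boolean {
  // Enter sends; Shift+Enter inserts a newline; nothing sends mid-stream.
  return e.key === "Enter" && !e.shiftKey && !streaming;
}
```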