The data-llm attribute allows your widget to communicate its current UI state back to the ChatGPT model. This creates a feedback loop where the model can understand what the user is seeing and respond contextually to their questions.
Basic usage
Static string
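A minimal sketch of the static form, assuming a React widget (the component name and copy are illustrative):

```tsx
// Static string: the description stays the same for as long as this element is rendered.
export function ParisFlights() {
  return (
    <div data-llm="Showing a list of flights to Paris">
      {/* flight rows go here */}
    </div>
  );
}
```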
Dynamic expression
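And a sketch of the dynamic form, where the value is recomputed from component state (the flight data and handlers are illustrative):

```tsx
import { useState } from "react";

export function FlightPicker({ flights }: { flights: { id: string }[] }) {
  const [selectedId, setSelectedId] = useState<string | null>(null);

  return (
    // Dynamic expression: the context updates whenever selectedId changes.
    <div
      data-llm={
        selectedId
          ? `User selected flight ${selectedId}`
          : `Showing ${flights.length} flights to Paris`
      }
    >
      {flights.map((f) => (
        <button key={f.id} onClick={() => setSelectedId(f.id)}>
          {f.id}
        </button>
      ))}
    </div>
  );
}
```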
Why it exists
ChatGPT Apps introduce a unique challenge: the model needs to understand both the conversation history and what the user is currently viewing in your widget. Without data-llm, the model only knows about the initial tool call that rendered your widget. As users interact with your UI, the model remains unaware of state changes unless you explicitly sync them.
Example scenario:
- User asks “Show me flights to Paris”
- Your widget displays 10 flights
- User clicks on “Flight AF123” to view details
- User asks “What’s the baggage policy?”
Without data-llm, the model doesn’t know which flight the user selected. With data-llm, your widget can sync this context, allowing the model to answer accurately.
How it works
The data-llm attribute is syntactic sugar that gets transformed at build time by Skybridge’s Babel plugin:
What you write:
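For illustration, a hedged sketch of the source form (the component and description are placeholders):

```tsx
export function ProductDetail({ product }: { product: { name: string; price: string } }) {
  return (
    // In source code, data-llm looks like an ordinary JSX attribute.
    <section data-llm={`Viewing product details for ${product.name} (${product.price})`}>
      <h2>{product.name}</h2>
      <p>{product.price}</p>
    </section>
  );
}
```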
The generated DataLLM component:
- Registers its content in a global state tree
- Automatically syncs with window.openai.setWidgetState
- Only shares currently rendered content (removed components are cleaned up)
- Supports nested hierarchies for complex UIs
Use cases and examples
E-commerce: Product browsing
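A hedged sketch for product browsing, assuming the model should know which product is open and how many items are in the cart (all names are illustrative):

```tsx
import { useState } from "react";

type Product = { id: string; name: string; price: string };

export function ProductBrowser({ products }: { products: Product[] }) {
  const [selected, setSelected] = useState<Product | null>(null);
  const [cart, setCart] = useState<Product[]>([]);

  return (
    <div
      data-llm={
        selected
          ? `Viewing ${selected.name} (${selected.price}); ${cart.length} items in cart`
          : `Browsing ${products.length} products; ${cart.length} items in cart`
      }
    >
      {products.map((p) => (
        <div key={p.id}>
          <button onClick={() => setSelected(p)}>{p.name}</button>
          <button onClick={() => setCart((c) => [...c, p])}>Add to cart</button>
        </div>
      ))}
    </div>
  );
}
```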
Multi-step wizard
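A hedged sketch of a wizard that reports its current step (step names are illustrative):

```tsx
import { useState } from "react";

const steps = ["Trip details", "Passenger info", "Payment"] as const;

export function BookingWizard() {
  const [step, setStep] = useState(0);

  return (
    <div data-llm={`Booking wizard: step ${step + 1} of ${steps.length} (${steps[step]})`}>
      <h2>{steps[step]}</h2>
      <button disabled={step === 0} onClick={() => setStep((s) => s - 1)}>
        Back
      </button>
      <button disabled={step === steps.length - 1} onClick={() => setStep((s) => s + 1)}>
        Next
      </button>
    </div>
  );
}
```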
Interactive data visualization
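A hedged sketch of a chart whose active filters are surfaced to the model (the controls and labels are illustrative):

```tsx
import { useState } from "react";

export function RevenueChart({ regions }: { regions: string[] }) {
  const [region, setRegion] = useState(regions[0]);
  const [range, setRange] = useState<"30d" | "90d" | "1y">("90d");

  return (
    <figure data-llm={`Revenue chart filtered to ${region} over the last ${range}`}>
      <select value={region} onChange={(e) => setRegion(e.target.value)}>
        {regions.map((r) => (
          <option key={r}>{r}</option>
        ))}
      </select>
      <select value={range} onChange={(e) => setRange(e.target.value as "30d" | "90d" | "1y")}>
        <option value="30d">30 days</option>
        <option value="90d">90 days</option>
        <option value="1y">1 year</option>
      </select>
      {/* chart rendering omitted */}
    </figure>
  );
}
```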
Search and filter interfaces
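A hedged sketch of a search UI that reports the active query and filters (fields are illustrative):

```tsx
import { useState } from "react";

export function HotelSearch({ results }: { results: { name: string }[] }) {
  const [query, setQuery] = useState("");
  const [maxPrice, setMaxPrice] = useState(300);

  return (
    <div data-llm={`Hotel search: query "${query}", max price $${maxPrice}, ${results.length} results shown`}>
      <input value={query} onChange={(e) => setQuery(e.target.value)} placeholder="Search hotels" />
      <input
        type="range"
        min={50}
        max={1000}
        value={maxPrice}
        onChange={(e) => setMaxPrice(Number(e.target.value))}
      />
      <ul>
        {results.map((r) => (
          <li key={r.name}>{r.name}</li>
        ))}
      </ul>
    </div>
  );
}
```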
Advanced patterns
Nested data-llm attributes
You can nest data-llm attributes to create hierarchical context:
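A hedged sketch, assuming a results list with a nested detail panel (data shapes are illustrative):

```tsx
export function FlightResults({
  flights,
  selected,
}: {
  flights: { id: string }[];
  selected?: { id: string; baggage: string };
}) {
  return (
    <div data-llm={`Flight results: ${flights.length} flights to Paris`}>
      {selected && (
        // Nested context is reported underneath its parent's context.
        <section data-llm={`Selected flight ${selected.id}; baggage policy panel open`}>
          <p>{selected.baggage}</p>
        </section>
      )}
      <ul>
        {flights.map((f) => (
          <li key={f.id}>{f.id}</li>
        ))}
      </ul>
    </div>
  );
}
```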
Conditional context
Only render data-llm when there’s meaningful state to share:
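A hedged sketch, assuming no context is worth sharing until an order is selected (the order shape is illustrative):

```tsx
export function OrderStatus({ order }: { order?: { id: string; status: string } }) {
  // No order selected yet: nothing meaningful to share, so omit data-llm entirely.
  if (!order) {
    return <p>Select an order to see its status.</p>;
  }

  return (
    <div data-llm={`Viewing order ${order.id}, current status: ${order.status}`}>
      <p>{order.status}</p>
    </div>
  );
}
```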
Rich context descriptions
Provide rich context that helps the model understand user intent:
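A hedged sketch of a richer description that names the visible content and the next actions available (the flight fields are illustrative):

```tsx
export function FlightDetail({
  flight,
}: {
  flight: { id: string; from: string; to: string; departs: string };
}) {
  return (
    // Rich description: what the user sees and what they can do next.
    <article
      data-llm={`User is reviewing flight ${flight.id} from ${flight.from} to ${flight.to}, departing ${flight.departs}. Baggage and fare details are visible; the user can select a seat or return to the results list.`}
    >
      {/* flight details UI */}
    </article>
  );
}
```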
Best practices
Do: Describe what the user sees
Do: Update when meaningful state changes
Do: Be concise but descriptive
Don’t: Include implementation details
Don’t: Use data-llm for every element
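A hedged sketch contrasting these practices (the descriptions are illustrative):

```tsx
export function GoodContext() {
  // Do: describe what the user sees; be concise but descriptive.
  return <div data-llm="Viewing a 3-day weather forecast for Berlin" />;
}

export function BadContext() {
  // Don't: leak implementation details or internal state names.
  return <div data-llm="forecastReducer state=LOADED cacheKey=berlin_v2 renderCount=14" />;
}
```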
When to use data-llm
Use data-llm when:
- User interactions change what’s displayed (navigation, selections, filters)
- The widget shows different views or states (wizard steps, tabs, modals)
- Context about the current view helps answer user questions
- You want the model to understand progressive actions (multi-step flows)
When NOT to use data-llm
Avoid data-llm when:
- The widget is purely static and never changes
- State changes are too frequent (animations, hover effects)
- The information is already in the conversation history
- The state is purely cosmetic (theme, collapsed panels)
Technical details
How context is synced
- Each data-llm attribute creates a DataLLM component
- Each component gets a unique ID and registers itself in a global map
- When content changes, the entire tree is traversed and formatted
- The formatted string is stored in window.openai.widgetState.__widget_context
- ChatGPT reads this value and includes it in the model’s context
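For example, you can inspect the synced value from the widget for debugging, assuming the window.openai global described above is present:

```ts
// Debugging sketch: log the context string the model will see.
// Optional chaining guards against running outside the ChatGPT host.
console.log((window as any).openai?.widgetState?.__widget_context);
```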
Component lifecycle
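A conceptual sketch of that lifecycle, not Skybridge’s actual implementation; the registry, the prop names, and the setWidgetState payload shape are assumptions:

```tsx
import { useEffect, useId, type ReactNode } from "react";

// Hypothetical module-level registry of currently mounted descriptions.
const registry = new Map<string, string>();

function syncContext() {
  // Format whatever is currently rendered and push it to the host.
  const context = Array.from(registry.values()).join("\n");
  (window as any).openai?.setWidgetState?.({ __widget_context: context });
}

export function DataLLM({ value, children }: { value: string; children?: ReactNode }) {
  const id = useId();

  useEffect(() => {
    registry.set(id, value); // register on mount and whenever the value changes
    syncContext();
    return () => {
      registry.delete(id); // removed components are cleaned up
      syncContext();
    };
  }, [id, value]);

  return <>{children}</>;
}
```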
Context format
Nested data-llm attributes create an indented list:
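For example, the nested flight widget sketched earlier might produce something along these lines (the exact formatting is illustrative):

```text
Flight results: 10 flights to Paris
  Selected flight AF123; baggage policy panel open
```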
