Deep: The UX of AI Assistants
Your ultimate guide to how top companies are designing the latest generation of AI Assistants
🔒DoP Deep goes deeper into the concepts and ideas that are covered in the Weekly Briefing to help you learn lessons from the experiences of top tech companies. If you’d like to receive these in-depth pieces of analysis, you can upgrade below. New reports added every month.
When AI Assistants first started shipping, every product design team seemed determined to ship theirs with a sparkles ✨ emoji.
Nobody’s quite sure where that trend started, but as AI Assistants have matured over the past few months, some companies are ditching the sparkles and adopting different ways to visualise AI features altogether.
But sparkles aside, the UX of AI Assistants is a fascinating space for product teams that comes with a bunch of unique challenges. For Microsoft and Google, this means creating a unified conversational AI Assistant experience across multiple sets of products, while other teams may decide to ignore the conversational interface altogether.
In this deep dive, we're going to explore how the UX of AI Assistants is evolving in 2025. With 20+ examples from companies including Lyft, Meta, Miro, Zoom and others, we'll take a look at AI assistants that work as standalone chat interfaces, in-product assistants that perform contextual actions, wizard-based workflows that guide users through complex tasks, and cross-platform solutions that aim to create consistent experiences across multiple products.
If you’re currently working on your own product’s AI Assistant, or you’re just interested in staying up to date with how other companies are building theirs, this deep dive should help.
Coming up:
A breakdown of the UX of 21 companies’ AI Assistants including Lyft, Adobe, Amazon, Deepseek, Brave, Webflow, Uber and more
Microsoft vs Google's design approach: why Copilot's "message" vs Gemini's "ask" prompt text reveals their fundamental product philosophies
Why YC leader Tom Blomfield says “It's shocking to see how badly” one AI assistant works
How some companies are ditching chat interfaces entirely for guided multi-step workflows - real-world examples for inspiration
The 20+ companies featured in full
What’s included in this Deep Dive UX report
Here’s how this deep dive is structured:
AI assistant name - some companies give their assistants names, like Amazon’s Rufus and Alexa. Others are simply called “AI assistant” or take the company’s name, e.g. Notion AI.
Category - this deep dive includes both in-product AI assistants (assistants built directly into products to perform actions or help users) and standalone AI products. Some assistants, like Microsoft’s Copilot and Google’s Gemini, are both, which brings its own challenges.
How it works - a brief description of what the AI assistant does and how this works.
UX patterns - every example has clearly marked UX patterns that are categorised - more on that below.
UI examples - downloadable examples of the UI including images and videos where necessary.
Default welcome text - the language AI assistants use to greet users is an important part of the overall UX. For each assistant featured, the welcome text is shared, with analysis of the most popular phrases used.
Link to more information - link to find out more about how the AI assistant works.
UX patterns
This analysis goes deep into the various UX patterns used by each of the assistants featured. There are 13 different UX patterns, clearly marked for each product, including elements such as:
Prompt suggestions - pre-written or dynamically generated prompts that users can select or modify to interact with the AI, helping users start conversations or overcome writer’s block and making the tool more accessible and user-friendly.
Promo banners - promotional messages or advertisements displayed within the AI assistant’s interface, highlighting premium features, updates, or partnerships, such as promoting a Pro subscription or a new image generation feature.
Feedback buttons - controls such as thumbs up/down or star ratings that allow users to rate or provide feedback on the AI’s responses, helping improve the assistant’s performance over time and showing users their input is valued.
Paste attachments - when a large chunk of text is copied into some AI assistants, it’s pasted as an attachment, making it easier for users to continue writing the prompt they’re working on. Not all assistants do this, but we’ll highlight which ones do and which ones don’t (see the sketch after this list).
Split screen - the screen is divided into multiple sections, allowing users to view and interact with different parts of the assistant simultaneously, such as a chat window on one side and generated output like code or images on the other, useful for multitasking or comparing content. Anthropic Claude uses a split screen UI when generating artifacts.
Wizard workflow - a step-by-step, guided process that walks users through complex tasks, such as setting up a project, generating a report, or creating a specific output, prompting users for details at each stage to ensure the output is accurate. LinkedIn’s hiring assistant, for example, uses this.
Contextual actions - dynamic, situation-specific actions or suggestions that appear based on the user’s input or conversation context, like offering options such as “Debug this code,” “Generate a visualization,” or “Explain this function,” to streamline interaction. In-product assistants come pre-built with actions they can perform on a user’s behalf.
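To make the paste attachments pattern concrete, here’s a minimal TypeScript sketch of how a prompt box might intercept a long paste and turn it into an attachment chip. The 2,000-character threshold, the PastedAttachment shape, and the addAttachment callback are all illustrative assumptions for this sketch, not taken from any of the products featured.

```typescript
// Minimal sketch of the paste-attachments pattern, assuming a browser
// environment. Threshold and data shapes are hypothetical.

const PASTE_ATTACHMENT_THRESHOLD = 2_000; // characters; illustrative cutoff

interface PastedAttachment {
  id: string;
  preview: string; // short snippet shown on the attachment chip
  fullText: string;
}

function handlePaste(
  event: ClipboardEvent,
  addAttachment: (attachment: PastedAttachment) => void
): void {
  const text = event.clipboardData?.getData("text/plain") ?? "";

  // Short pastes flow into the prompt box as normal text.
  if (text.length < PASTE_ATTACHMENT_THRESHOLD) return;

  // Long pastes are intercepted and shown as an attachment instead,
  // keeping the prompt box free for the user's actual instructions.
  event.preventDefault();
  addAttachment({
    id: crypto.randomUUID(),
    preview: text.split("\n")[0].slice(0, 80),
    fullText: text,
  });
}
```

The key design choice is that short pastes behave exactly as users expect, so the pattern only kicks in when pasted text would otherwise drown out the prompt the user is trying to write.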
A closer look at some examples with analysis and key takeaways
Let’s take a closer look at some examples from specific companies. We’ll also explore some of the key trends and takeaways that you can use to inform the design of your own product’s AI Assistant.