Designing better UX for AI — 8 best practices to follow

I analysed 150 interactions to identify the 8 best practices you should follow when designing the UX for AI, especially conversational AI.

Kshitij Agrawal · 7 min read · Sep 7, 2024

Collection of all Consumer AI — source: aiverse

What are best practices?

What are some common ways companies are designing for AI? How are they making sure the solution caters to all types of users?

General patterns for AI-UX can be categorised by the level at which they operate. I like to call these levels the "Altitudes of AI Design". We looked at this in depth previously, when asking ourselves, "at what level are we designing for AI?"

The following 8 practices are a mix of solutions at different altitudes.

Let's dive in!

1. Know when AI should enter

All AI-UX interactions in Humane ai pin — source: aiverse

Overview

Everyone's slapping AI on their products like it's hot sauce at a taco truck. But here's the thing: discoverability is key. It's not just about throwing a sparkly AI button on your app and calling it a day. Although right now, you could probably get away with that. Slap an AI sticker on a potato and people would buy it. But what happens when the hype is over?

You want your AI to be the cool neighbour who brings over a six-pack when you need it, not the weirdo who keeps popping up uninvited. If you're designing for a non-conversational AI-UX, which I believe to be the future, the "smart robot" needs to be integrated into the user's workflow: accelerating tedious tasks and easing pain points exactly when the user is about to cry for help.

AI-UX interactions breakdown

What are some examples of entry touchpoints in live products?

  • In Arc, when you're searching, the "Browse for me" button is an accelerator. It takes you from knowing what to search for straight to the answer, with no in-between step of finding the right resource.

  • In Apple's new writing tools, a separate button is shown when the user highlights text, giving them the option to edit it based on pre-defined commands.

  • In Grammarly, there are different entry points based on where you are in your workflow. If you're writing an email, the discovery is with the cursor. If you've highlighted a block of text, AI is available right there. And if you're writing in a doc, the discoverability is tucked away in the corner of the page. (A small code sketch of this context-aware pattern follows the list.)


The above screenshots are from our "Trending AI-UX Patterns" insights ebook, with 50+ examples, each broken down by the best practices.

Live examples from our library

2. Transparency of "magical" moments

All AI-UX interactions in Humane ai pin — source: aiverse

Overview

You're sitting there, waiting for this magic robot brain to spit out some genius answer. But you've got no clue what's happening behind the curtain. Is it thinking? Is it confused? Is it secretly judging your terrible spelling?

Meanwhile, the AI is crunching numbers like a beast, but it has zero idea that you're over there tapping your foot, checking your watch, and wondering if you should've just used Google like a normal person.

And that gap right there? That's where the magic happens in AI user experience. That's the stuff that makes or breaks the whole interaction.

AI-UX interactions breakdown

What are some examples of transparency in live products?

  • In Arc Search, the resources being scanned for the answer are shown, instantly giving the user confidence in what's happening and whether or not they can rely on the curated answer.

  • In Figma, a separate loading status is displayed and the components are at reduced opacity to signal that the information is being generated "here".

  • Extending this, Elicit has subtle blinking bars to indicate loading while staying out of your face, since it mostly performs bulk actions. Subtle yet informative.

  • In Photomath, all the loading behind the scenes happens under a "scanning the image" animation. It's instant, and the user knows the app has captured the input they intended. (A small status-model sketch follows the list.)


The above screenshots are from our "Trending AI-UX Patterns" insights ebook, with 50+ examples, each broken down by the best practices.

Live examples from our library

3. Manage users' expectations

All AI-UX interactions in Humane ai pin — source: aiverse

Overview

Setting the right expectations of what the AI can do is very important, whether by explicitly defining its capabilities or by handling error states to direct the user back onto the right path.

There are two ways of managing users' expectations:
1. Upfront - before the user even starts using the AI, knowing what it can and cannot do helps them frame their requests.
2. Error handling - if the user does write something that doesn't make sense, those errors need to be handled gracefully.

AI-UX interactions breakdown

What are some examples of managing expectations in live products?

Upfront

  • In Arc Search, the resources being scanned for the answer are shown, instantly giving the user confidence in what's happening and whether or not they can rely on the curated answer.

  • In Figma, a separate loading status is displayed and the components are at reduced opacity to signal that the information is being generated "here".

Error handling

  • Extending this, Elicit has subtle blinking bars to indicate loading while staying out of your face, since it mostly performs bulk actions. Subtle yet informative.

  • In Photomath, all the loading behind the scenes happens under a "scanning the image" animation. It's instant, and the user knows the app has captured the input they intended. (A capability-check sketch follows the list.)


The above screenshots are from our "Trending AI-UX Patterns" insights ebook, with 50+ examples, each broken down by the best practices.

Live examples from our library

—— And that's it for the AIxDesign best practices, thank you for reading! ——

Want more deep dives on AIxDesign? We launched our insights ebook (volume 1), a collection of 50+ examples gathered from top products designing for AI.

