aiverse.design

Granular editing

Enabling intuitive, human-style granular editing on AI-generated responses.

Overview

Imagine if you could highlight or circle AI-generated output, just like with pen and paper, and give your suggestions there and then. This pattern mimics human behaviour and lets you quickly get AI’s help on specific sections of the response. You can make fast edits without re-inputting everything in words. It brings the ease of human-like editing into the digital world.

User intent

Getting results faster

Macro trend

Human in the loop

Let’s be honest: the most frustrating part of using AI is rewriting the entire prompt just to fix one section.

But what if you could simply mark the part you want to change, right where you're working, and the AI could instantly help? No more endless rewrites. Just point to the spot, and voilà.

Editing feels more human because now, it's always in your flow. It's fast, and intuitive.

Consumer-focused companies are integrating this pattern to increase adoption.

Here’s how products are blending our natural editing behaviours with intelligence:

Examples

AI can generate various types of responses, but textual and visual content typically requires the most frequent and iterative refinement. So let's dive deep into it.

1. Conversational & Documentation apps

We’ve grown up using highlighters, red pens, and margin notes; our brains are wired to critique and suggest improvements.

The challenge is replicating that same habit in digital apps: building on natural human behaviours so users can critique specific lines.

AI-first products have solved exactly this.

Lex.Page lets you highlight text and add AI suggestions right there, instead of describing "where" the problem lies in a chat interface.

Screenshot of Lex.page’s UX / Source: Aiverse

ChatGPT does the same: highlight any section, and it instantly becomes the context for your next edit.

Interaction of selecting text for reference in ChatGPT / Source: Aiverse
Reference selected for the next AI prompt in ChatGPT / Source: Aiverse
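Mechanically, this interaction boils down to capturing the highlighted span plus a little surrounding text, and packaging both with the user's instruction. Here's a minimal sketch in TypeScript; the `GranularEditRequest` shape is a hypothetical illustration, not ChatGPT's or Lex.Page's actual API.

```typescript
// Sketch: turning a user's text selection into context for an AI edit request.
// The payload shape below is illustrative, not any product's real API.

interface GranularEditRequest {
  selected: string;    // the span the user highlighted
  before: string;      // surrounding context so the model can match tone
  after: string;
  instruction: string; // what the user wants changed
}

function buildGranularEdit(
  fullText: string,
  selStart: number,
  selEnd: number,
  instruction: string,
  contextChars = 80, // how much surrounding text to include on each side
): GranularEditRequest {
  return {
    selected: fullText.slice(selStart, selEnd),
    before: fullText.slice(Math.max(0, selStart - contextChars), selStart),
    after: fullText.slice(selEnd, selEnd + contextChars),
    instruction,
  };
}

// Example: the user highlights "quick" and asks for a more formal word.
const doc = "The quick fix shipped yesterday.";
const req = buildGranularEdit(doc, 4, 9, "Use a more formal word");
// req.selected === "quick"
```

The point of including `before` and `after` is that the model only rewrites the selection, but still sees enough context to keep the edit consistent with the rest of the text.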


2. Creative tools

Next up: it gets smarter, and a lot more fun.

We as humans naturally like marking things visually. Sometimes, circling a flaw and sketching a quick correction feels easier than explaining it in words. And it makes sense: it doesn't require technical know-how, and it follows the famous advice to "show, don't tell".

Now, products are replicating that same interaction.

AI becomes your creative partner, letting you make the edits you want without knowing design terms like "padding", "glassmorphism", or "texture".

Photoshop took this one step further, letting users mark any area of an image to add, remove, or tweak elements.

Selecting an area in Photoshop - Entry Point / Source: Aiverse
Prompting on selected area - Photoshop / Source: Aiverse
Variations based on prompt / Source: Aiverse


3. Software creation apps

As designers, we’re used to spotting what’s wrong in a UI or a workflow and then critiquing it just by pointing.

You can select a component, flag a behaviour, or highlight a layout issue, and the LLM will tweak the UI/UX accordingly.

Vercel's V0 lets you select any UI component and request fixes without changing the rest of the design.

Entry point for pick & edit in V0 / Source: Aiverse
Prompting on the selected component / Source: Aiverse
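The core idea here, scoping an edit to one selected node while leaving its siblings untouched, can be sketched in a few lines of TypeScript. The `UINode` shape and the ids below are hypothetical, not V0's internal model.

```typescript
// Sketch: applying an AI-suggested edit to exactly one component in a UI
// tree, "pick & edit" style. Node shape and ids are illustrative only.

interface UINode {
  id: string;
  type: string;
  props: Record<string, unknown>;
  children: UINode[];
}

// Apply `edit` only to the node with `targetId`; every untouched subtree
// is returned by reference, so the rest of the design is guaranteed to be
// byte-for-byte unchanged.
function editComponent(
  node: UINode,
  targetId: string,
  edit: (n: UINode) => UINode,
): UINode {
  if (node.id === targetId) return edit(node);
  const children = node.children.map((c) => editComponent(c, targetId, edit));
  return children.some((c, i) => c !== node.children[i])
    ? { ...node, children }
    : node;
}

// Example: rewrite one button's label, leaving the hero section untouched.
const tree: UINode = {
  id: "root", type: "page", props: {},
  children: [
    { id: "hero", type: "section", props: { title: "Hi" }, children: [] },
    { id: "cta", type: "button", props: { label: "Buy" }, children: [] },
  ],
};
const edited = editComponent(tree, "cta", (n) => ({
  ...n,
  props: { ...n.props, label: "Buy now" },
}));
```

Returning unchanged subtrees by reference is a design choice worth noting: it makes "nothing else moved" checkable with a simple identity comparison, which is exactly the promise this pattern makes to the user.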




Curated by Aiverse research team
Published on May 30, 2025
Last edited on May 30, 2025
