Visual cue recognition

Using a camera to understand human visual cues and adapt digital experiences.

Overview

While communicating, we have always picked up on visual cues to understand others' feelings and intentions, and adapted ourselves accordingly.

Until now, technology hasn't been able to pick up on these subjective visual cues. Camera-first AI is changing that. It reads visual signals, such as where you're looking or how you're feeling, so devices can respond without you saying or doing anything directly.

This creates more natural experiences, where interfaces adapt to emotions, attention shifts, and unspoken needs, not just clicks and taps. Just like we do!
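To make that concrete, here is a minimal sketch of reading one such signal from a webcam: whether a face is present and roughly centred, which a product could treat as "the user is attending to the screen". It assumes the OpenCV and MediaPipe Python packages, and the centring band is illustrative rather than a tuned value.

```python
# Minimal sketch: grab one webcam frame and check whether a face is
# present and roughly centred. Assumes `opencv-python` and `mediapipe`;
# the 0.3-0.7 band is an illustrative threshold, not a tuned value.
import cv2
import mediapipe as mp

face_detection = mp.solutions.face_detection.FaceDetection(
    model_selection=0, min_detection_confidence=0.6)

cap = cv2.VideoCapture(0)                      # default webcam
ok, frame = cap.read()
if ok:
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    results = face_detection.process(rgb)
    if results.detections:
        box = results.detections[0].location_data.relative_bounding_box
        centre_x = box.xmin + box.width / 2    # relative (0..1) coordinates
        attending = 0.3 < centre_x < 0.7
        print("user seems to be facing the screen" if attending
              else "user seems to have looked away")
    else:
        print("no face in frame")
cap.release()
```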

User intent: Anticipating needs

Macro trend: Human in the loop

Should our devices be observing us?

It might sound intrusive at first, but we rely on visual cues all the time. We notice when someone looks confused, excited, or distracted. These signals help us adjust our communication intelligently.

Now imagine if technology could do the same. If your laptop sees you looking frustrated or fixated on one part of the screen, it could anticipate your intent and adjust what you're seeing. It sounds almost like science fiction, but it's now possible! It might make the screen easier to use or change how you interact with it.

It’s like your phone or computer responding to you, not just your touch, to make the experience more natural.

Let’s see how this works in real life.

Examples

Jaws revolutionises video communication for visually impaired users with its face-tracking system. Green dotted trackers and audio cues guide blind users into position so they face the camera, making video calls and ID verification accessible. No more awkward angles!

Screenshot of Jaws UX / Source: Aiverse
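The guidance itself can be thought of as a small rule: compare where the detected face sits in the frame with the frame centre, then voice a hint. The sketch below is only an illustration of that idea, not Jaws's actual implementation; the face box uses relative (0 to 1) coordinates as returned by common face detectors.

```python
# Illustrative positioning cue: describe where the detected face sits
# relative to the frame centre so an audio prompt can guide the user.
# `face_box` = (xmin, ymin, width, height) in relative 0..1 coordinates.
def positioning_cue(face_box, tolerance=0.1):
    xmin, ymin, width, height = face_box
    cx = xmin + width / 2
    cy = ymin + height / 2
    if cx < 0.5 - tolerance:
        return "Your face is left of centre in the frame"
    if cx > 0.5 + tolerance:
        return "Your face is right of centre in the frame"
    if cy < 0.5 - tolerance:
        return "Your face is too high in the frame"
    if cy > 0.5 + tolerance:
        return "Your face is too low in the frame"
    return "You are centred, hold still"

print(positioning_cue((0.15, 0.40, 0.30, 0.30)))  # -> left of centre
```

A real product would hand the returned string to a screen reader or text-to-speech engine rather than print it.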


New Computer presented a concept at the AIE Summit (2023) where the system reads your intentions through webcam intelligence. Face the screen? Keyboard mode activates. Look away? Voice input takes over. This eliminates manual toggling; it just knows. The interaction is perfect for meetings, cooking, or multitasking. We are moving toward technology adapting to humans, not the other way around.

Screenshot of New Computer's Concept UX / Source: Aiverse
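The switch described above can be reduced to a debounced rule on a "facing the screen" signal, so a brief glance away doesn't flip modes. The sketch below is a guess at that logic, not New Computer's code; the facing signal would come from a webcam-based face or gaze detector like the one sketched earlier.

```python
# Illustrative mode switch: keyboard while the user faces the screen,
# voice once they have looked away for longer than a grace period.
import time

def choose_input_mode(facing_screen: bool, last_seen_facing: float,
                      now: float, grace_period_s: float = 1.5) -> str:
    if facing_screen:
        return "keyboard"
    if now - last_seen_facing < grace_period_s:
        return "keyboard"      # brief glance away, don't switch yet
    return "voice"

# Example: the user turned away three seconds ago.
print(choose_input_mode(False, last_seen_facing=time.time() - 3.0,
                        now=time.time()))   # -> "voice"
```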


Hunger Station uses eye-tracking technology to understand what food you want. The system monitors which food images capture your attention, creates heatmaps in the background, and scores your subconscious food desires. Decision fatigue vanishes as the app surfaces what you want without endless scrolling. Your eyes order for you!

Screenshot of Hunger Station UX / Source: Aiverse
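Dwell-time scoring of this kind can be sketched in a few lines: accumulate how long the gaze rests inside each food card and rank the cards. This is an illustrative approximation, not Hunger Station's implementation; the gaze points and card rectangles would come from an eye tracker and the app's layout.

```python
# Illustrative dwell-time scoring: sum gaze time per food card and rank.
from collections import defaultdict

def score_dwell(gaze_samples, cards, sample_interval_s=0.033):
    """gaze_samples: [(x, y), ...] in pixels; cards: {name: (x0, y0, x1, y1)}."""
    dwell = defaultdict(float)
    for x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in cards.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] += sample_interval_s
    return sorted(dwell.items(), key=lambda kv: kv[1], reverse=True)

cards = {"margherita": (0, 0, 200, 150), "sushi_set": (0, 160, 200, 310)}
samples = [(90, 70)] * 45 + [(100, 220)] * 15   # ~1.5 s vs ~0.5 s of gaze
print(score_dwell(samples, cards))               # margherita ranks first
```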


What’s exciting is thinking about how products will adopt this pattern while still maintaining users’ trust.

Maybe it’s through a no-storage policy or by processing data only on the device. Either way, it’s something to look out for.


Curated by: Aiverse research team

Published on: 30 May 2025

Last edited on: 30 May 2025



© Aiverse 2025
Designing for AI, Augmenting with AI