Interactive Cognitive Load

[Author: Bill Fischer]

Overview

Cognitive load is a particularly challenging aspect of designing interactive media. Navigation and layout systems depend on the user's working memory, executive function, fine motor skills, spatial organization, language processing, and sensory-input filtering to make sense of what is in front of them. The design goal is to minimize the extraneous cognitive load and working memory required to access information through the graphical user interface.

The I-See-U blueprint breaks this down into three user interaction events:



Preamble

When a user is presented with a graphic user interface, there is a lot of cognitive work they need to do before taking action. How challenging this is depends on where they land on the neurodiverse spectrum, what environmental distractions are present, and which physical stresses they may be experiencing. This section recommends several methodologies for optimizing the preamble experience.

Epic+KCAD partnered with Protege Games and Innocademy Schools to imagine a virtual-reality, educational field trip called Amplify: Journey To Mars (external link). The concept integrates Microsoft HoloLens AR with projected images and walkie-talkie-style communications to create an immersive experience that students can engage with in small groups. The UI challenge involved isolating the interactive components (signal) from the immersive environment (noise).


Signal and Noise Management

Visual assets can be categorized in three ways: information, decoration, and distraction. 


Discoverability

"Discoverability" is a design method that calls for all possible actions to be visible in full view (not hidden). Working memory has limited capacity; it is the system where we temporarily hold information for processing things like navigation systems and page/screen content. Examples include:


Affordance

When users are provided a choice, it should be clear what the results of that choice will be, and the choice should be built on an expected, shared understanding. This can include:



Action

Memory Optimization

A middle-school child aims a tablet, with an animation running on it, at a poster hanging on the wall.

Epic+KCAD partnered with the Grand Rapids Public Museum to develop a place-based, augmented reality, role-playing game in the Old Streets exhibit, called Old Streets Adventure (external link). There are no buttons in the app. The interface is controlled by the movement of the tablet alone. Users learn how to navigate through a feedback loop that is activated by moving the tablet. Interactions include: start, zoom, replay, and change focus.



Reaction

Feedback

Providing information that orients the user and confirms the success or failure of their choices is important because it mirrors the physical world, which provides constant feedback to our senses. That feedback is the norm for most of human experience, but digital interactive systems provide it only if the designer includes it. This can include:
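One way to picture this principle in code is an action handler that always returns a visible status message, whether the action succeeds or fails. This is a minimal illustrative sketch, not part of any project described above; the function and field names are hypothetical.

```python
def submit_form(fields: dict) -> str:
    """Handle a form submission and always return user-facing feedback.

    The key design choice: there is no silent path. Every branch
    produces a message that orients the user, confirming success or
    explaining failure, the way the physical world gives constant
    sensory feedback.
    """
    missing = [name for name, value in fields.items() if not value]
    if missing:
        # Failure feedback: name exactly what needs fixing.
        return f"Could not submit: please fill in {', '.join(missing)}."
    # Success feedback: confirm that the user's choice worked.
    return "Your form was submitted successfully."
```

For example, `submit_form({"name": "Ada", "email": ""})` tells the user which field is empty instead of failing silently.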


Recovery

Undo, Redo, and Back are essential navigational elements for usability. Without them, users must return to 'home' pages/screens or navigation systems to undo an action.
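The recovery controls above are commonly modeled as two stacks: applying an action pushes the prior state onto an undo stack, and undoing pushes the current state onto a redo stack. Here is a minimal sketch of that pattern, with hypothetical class and method names.

```python
class History:
    """Two-stack undo/redo model for a recoverable interface state."""

    def __init__(self, state):
        self.state = state
        self._undo = []  # states we can go back to
        self._redo = []  # states we can go forward to after an undo

    def apply(self, new_state):
        """Record the current state, then move to the new one.

        Taking a fresh action invalidates the redo history,
        matching the behavior of most editors."""
        self._undo.append(self.state)
        self._redo.clear()
        self.state = new_state

    def undo(self):
        """Step back one state, if possible, and return the result."""
        if self._undo:
            self._redo.append(self.state)
            self.state = self._undo.pop()
        return self.state

    def redo(self):
        """Step forward one state, if possible, and return the result."""
        if self._redo:
            self._undo.append(self.state)
            self.state = self._redo.pop()
        return self.state
```

With this in place, a user can reverse a mistaken choice directly (`undo()`) instead of navigating back to a home screen to start over.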


How Blind and Sight-impaired Persons Understand the Visual World

We need to keep in mind that sight-impaired people build their own understanding of on-screen events within the context of how they experience the world through their available senses, an experience rooted in sound, touch, smell, and taste. In video, we only have sound to work with, so audio storytelling that addresses as many senses as possible is key.

I have created shortcuts to segments inside of the videos to make it easier to revisit them, and also to provide a curated experience of the parts that I thought were most relevant to issues that are centered around how blind persons understand the visual world.

Types of blindness explained

How a blind person processes the visual world

Challenges of watching video as a blind person


Communicating while deaf or hearing impaired 

Getting to know the various ways that deaf and hard-of-hearing persons generally communicate with the hearing world can offer insights that we can synthesize into our communication-focused media design.

I have created shortcuts to segments inside of the videos to make it easier to revisit them, and also to provide a curated experience of the parts that I thought were most relevant to issues that are centered around communication.


Experiencing sensory inputs when neurodiverse

Understanding how neurodiverse persons experience sensory inputs can inform our design decisions for media that can deliver multimodes of input simultaneously.

I have created shortcuts to segments inside of the videos to make it easier to revisit them, and also to provide a curated experience of the parts that I thought were most relevant to issues that are centered around how neurodiverse persons react to sensory inputs.

Dyslexia simulation

Watch the entire 2-minute video.