Psych for Prod Dev: Research To Inform Multisensory Design

Industrial design is a discipline with a lot of influences. That can be exciting, and it can be messy. No one practices it exactly the same way. Often designers are heavily affected by where they attended school. Different industrial design programs lean toward different core philosophies — from human factors to fine arts. Once you get out into the “real world,” things are further muddled (or enriched, depending on your POV) by the increasing number of disciplines — interaction design, experience design — that overlap with ID, many of which are rapidly experiencing their own fragmentation. (Consider how many different ways there are to say “human factors” and “user experience.”)

As design firms mature, principals tend to coalesce around a philosophy that informs a set of operating rules. At Bresslergroup, ours is to seek emotional outcomes through rational thinking. As with all things, our slant remains pragmatic as we gain more experience with multisensory design.

We developed a multisensory footprint chart to map products’ multisensory signatures during competitive benchmarking.

We’ve developed a multisensory footprint chart (above) to help us visualize and assess products’ sensory profiles. We’ve experimented with adding sensory intake information to user personas. (We mocked up what that might look like below by grafting “sensory intake dials” onto a user persona from a visual brand identity project for a line of kitchen appliances.) And check out our post on Core77 about user research approaches for designers looking to achieve the right mix of multisensory features.

This is what sensory intake information might look like when added to a user persona.
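
Either way, the underlying data is simple to represent. Here's a minimal Python sketch of how a product's multisensory footprint might be captured and compared during competitive benchmarking; the five sense dimensions, the 0-to-5 intensity scale, and the kettle ratings are all illustrative assumptions, not data from our chart:

```python
# Minimal sketch: representing and comparing multisensory footprints.
# The sense dimensions, the 0-5 scale, and the ratings are illustrative
# assumptions, not actual benchmarking data.

SENSES = ("sight", "sound", "touch", "smell", "taste")

def footprint(**ratings):
    """Build a footprint dict on a 0-5 scale, defaulting unrated senses to 0."""
    unknown = set(ratings) - set(SENSES)
    if unknown:
        raise ValueError(f"unknown senses: {unknown}")
    return {sense: ratings.get(sense, 0) for sense in SENSES}

def compare(ours, theirs):
    """Return the per-sense difference between two footprints."""
    return {sense: ours[sense] - theirs[sense] for sense in SENSES}

our_kettle = footprint(sight=4, sound=2, touch=3)
rival_kettle = footprint(sight=3, sound=4, touch=3, smell=1)

print(compare(our_kettle, rival_kettle))
# {'sight': 1, 'sound': -2, 'touch': 0, 'smell': -1, 'taste': 0}
```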

As we continue to define and set up a framework for analyzing and developing effective multisensory design, we're realizing this area of product design benefits from a trove of psychological and human factors research that already exists. We've compiled some of it here, along with takeaways for product designers who are working out which multisensory cues to integrate into their work.

A Crash Course in Multisensory Design-Related Research

The following diagram is a pretty good overview, but it isn't exhaustive, nor does it capture the complexities of overlap among the areas of research related to multisensory design.

The areas of psychological and human factors research related to multisensory design.

Here’s a summary of each area, describing its lens into how humans take in information and the takeaway for designers. (See our list of sources and links at the end of the post if you want to explore further.)

Area of Research: Attention Research

This is one of the more prominent lines of psychological research relating to multimodal, or multisensory, design. Recent research in “attention” has emphasized frameworks characterized by multiple attentional resources (see diagram below).

Basically, our two primary sources of information, or modalities, are visual and auditory. Each modality can receive two types of information, or codes: spatial and verbal. Spatial refers to the location of objects in space (e.g., how far away is the car ahead of you?). Verbal refers to the written or spoken word (e.g., the definition of a word, or the contents of the book where you found that word).

Multiple resource theory counsels against overloading a single modality. A good example is how GPS devices use vocal instructions to avoid overloading the driver's visual modality: people can't look down at a map and up at the road at the same time, or at least can't do both well.
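
To make the channels concrete, here's a minimal Python sketch (our own simplification for illustration, not Wickens' computational model) that flags when two concurrent tasks demand the same modality-and-code channel:

```python
# Rough sketch of multiple resource theory's central idea: concurrent
# tasks that demand the same modality + code channel are likely to
# interfere. A simplification for illustration, not Wickens' model.

MODALITIES = {"visual", "auditory"}
CODES = {"spatial", "verbal"}

def channel(modality, code):
    """A single attentional resource channel, e.g., ('visual', 'spatial')."""
    assert modality in MODALITIES and code in CODES
    return (modality, code)

def conflicts(task_a, task_b):
    """Return the channels that two concurrent tasks both demand."""
    return task_a & task_b

driving = {channel("visual", "spatial")}          # watching the road
map_reading = {channel("visual", "spatial")}      # glancing at a paper map
voice_guidance = {channel("auditory", "verbal")}  # spoken GPS directions

print(conflicts(driving, map_reading))     # {('visual', 'spatial')} -> overload risk
print(conflicts(driving, voice_guidance))  # set() -> better timesharing
```

Driving and map reading collide on the visual-spatial channel; driving and spoken directions don't, which is the timesharing logic behind voice-guided GPS.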

Designer’s Takeaway:

Instead of designing for a particular sense (e.g., vision or audition), design for the user, who is a multisensory organism. By considering the design's effects within the context of the whole human, you're more likely to account for both the constraints the user imposes on the design and the strengths he or she brings to it.

From Wickens, C. (2002). Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3(2), 159–177.


Area of Research: User Research (Contextual Inquiry)

There is a broad range of user research methods that focus on integrating data from the user into the design. Contextual inquiry (the kind of research we designed FieldCREW to help designers conduct) is an ethnographic method where designers observe people using the product “in the field,” or in context. This methodology lends itself well to multimodal research, because it's important to know the nature of the other information sources competing with, or “timesharing” with, a device. (“Timesharing” is not about a condo in the Bahamas. It's a human factors term for the environmental factors demanding a user's attention.)

Designer’s Takeaway:

This approach looks beyond user-centered design to investigate how the design will work within the context of a specific environment or surroundings. The most effective multisensory design takes into account both the single user’s interactions and his or her interaction within a complex system.

Area of Research: Human Information Processing

Our senses do not play equal roles day to day; some are main characters and others have smaller parts. The visual modality is the most dominant, followed by the auditory, and this hierarchy plays into the study of cross-modal integration: the integration of multiple sensory effects into one product or use case.

There are a number of circumstances where cross-modal integration can lead to confusion for the user. Visual stimuli can have a greater influence on other modalities, which is referred to as a visual dominance effect. The McGurk Effect illustrates visual dominance: if a speaker produces one sound (e.g., “bah”) but her mouth looks like it is saying another (e.g., “fah”), you will perceive “fah.”

Designer’s Takeaway:

Look beyond the perceptual stage to the deeper cognitive processing where multiple sources of information are integrated. The interplay between the senses becomes a factor when you're including multiple sensory cues as part of the whole product experience.

Area of Research: Auditory and Visual Perception

“Perceived duration effects” are one example of a specific finding regarding our two dominant senses, visual and auditory. The term refers to instances in which visual and auditory (or haptic) stimuli are presented for the same duration, yet the visual stimulus can be perceived as lasting a shorter time.

When adding redundant or supplementary information in a modality other than the primary information source, it is important to optimize how that information is presented. If auditory stimuli are added to communicate spatial information, then the speakers or headphones should be capable of conveying spatial information, such as through 3-D stereo sound.

Designer’s Takeaway:

Know which senses are adept at which types of tasks, and design to make use of human capabilities.

Area of Research: Somatosensory (Haptic) Perception

Of course, other senses can play an important part in taking in information, even if they're not as dominant as visual and auditory. The Guidelines on the Multimodality of Icons, Symbols, and Pictograms published by the European Telecommunications Standards Institute (ETSI) demonstrate this point well. The table (below) recommends specific senses to target depending on the kind of information you're trying to deliver via a telecommunications display.

[Table from ETSI EG 202 048: recommended modalities for icons, symbols, and pictograms, by the kind of information to be delivered.]
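
In practice, guidance like this boils down to a lookup table. The Python sketch below shows one way a team might encode such a mapping; the entries are hypothetical stand-ins, not the actual ETSI recommendations:

```python
# Hypothetical mapping from information type to recommended modalities,
# in the spirit of the ETSI table above. These entries are illustrative;
# consult ETSI EG 202 048 for the actual recommendations.

RECOMMENDED_MODALITIES = {
    "spatial alert":   ["visual", "haptic"],
    "urgent warning":  ["auditory", "haptic"],
    "detailed status": ["visual"],
    "confirmation":    ["auditory", "haptic"],
}

def modalities_for(info_type):
    """Look up recommended modalities, falling back to visual."""
    return RECOMMENDED_MODALITIES.get(info_type, ["visual"])

print(modalities_for("urgent warning"))  # ['auditory', 'haptic']
print(modalities_for("settings menu"))   # ['visual'] (fallback)
```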

Designer’s Takeaway:

Choose which modality to target based partly on what kind of information or emotion you want the product to communicate.

Area of Research: Human Performance

We don't just attend to visual and auditory stimuli in our environment; we also perform tasks, and performing tasks draws on skill. As skill is gained, performance transitions from a controlled process to an automatic one. One effect of acquiring skill is that the amount of cognitive resources necessary to perform the task decreases, leaving the user more open to multisensory cues.

Tying a shoe is a highly automated skill; we can do it and hold a conversation at the same time. A skilled nurse can perform procedures while listening to a doctor, but a nurse who is unskilled in a procedure will need to focus on his performance, which lessens his ability to listen to a doctor.

Designer’s Takeaway:

When designing products, it’s important to consider the skill level of all the users. Those who are highly skilled will be more receptive to sensory stimuli. For others, the extra information may just be distracting or annoying.
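
One way to act on this is to gate supplementary cues by skill level. Below is a minimal Python sketch of that idea; the 0-to-10 scale, the thresholds, and the cue names are our own illustrative assumptions, not research-derived values:

```python
# Illustrative only: gating supplementary sensory cues by skill level.
# The 0-10 scale, thresholds, and cue names are assumptions.

def cues_for(skill_level):
    """Return the cue channels to enable for a user's skill level (0-10)."""
    cues = ["primary_visual"]           # everyone gets the core display
    if skill_level >= 4:
        cues.append("auditory_status")  # once the task is partly automatic
    if skill_level >= 7:
        cues.append("haptic_detail")    # fine-grained cues for experts
    return cues

print(cues_for(2))  # ['primary_visual']
print(cues_for(8))  # ['primary_visual', 'auditory_status', 'haptic_detail']
```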

See the list below to find the sources of this research as well as additional readings that cater to the layperson — perfect for designers brushing up on some very relevant and potentially influential findings.

 

Sources and references:
– European Telecommunications Standards Institute (ETSI). (2002). Human factors (HF): Guidelines on the multimodality of icons, symbols, and pictograms (Report No. ETSI EG 202 048 v1.1.1 (2002-08)). Sophia Antipolis, France: ETSI.
– Beyer, H., & Holtzblatt, K. (1998). Contextual design: Defining customer-centered systems. San Francisco, CA: Morgan Kaufmann.
– Sarter, N. B. (2006). Multimodal information presentation: Design guidance and research challenges. International Journal of Industrial Ergonomics, 36(5), 439–445.
– Stanney, K., Samman, S., Reeves, L., Hale, K., Buff, W., Bowers, C., Goldiez, B., Nicholson, D., & Lackey, S. (2004). A paradigm shift in interactive computing: Deriving multimodal design principles from behavioral and neurological foundations. International Journal of Human-Computer Interaction, 17(2), 229–257.
– Stanney, K., & Cohn, J. (2012). Virtual environments. In G. Salvendy (Ed.), Handbook of human factors and ergonomics (4th ed., Chapter 36). Hoboken, NJ: John Wiley & Sons.
– Wickens, C. (2002). Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3(2), 159–177.
 
Additional readings:
– Johnson, J. (2014). Designing with the mind in mind: A simple guide to understanding user interface design guidelines. Amsterdam: Morgan Kaufmann. (Very accessible to the layperson learning about human factors-related design issues.)
– Wickens, C., Gordon, S., & Liu, Y. (2004). An introduction to human factors engineering. Upper Saddle River, NJ: Pearson Prentice Hall. (For those looking for more of a guided tour / college textbook.)
– Salvendy, G. (Ed.). (2012). Handbook of human factors and ergonomics (4th ed.). Hoboken, NJ: John Wiley & Sons. (A very comprehensive reference on human factors topics.)
– Sanders, M., & McCormick, E. (1993). Human factors in engineering and design. New York: McGraw-Hill. (Not as up to date as the other two, but a great reference book, especially for perceptual and some anthropometric guidelines.)