We flew to Chicago a few weeks ago to deliver three talks at the 2019 HFES Healthcare Symposium, including one on long-form contextual inquiry, called “Mind if I Awkwardly Watch You for Four Hours?,” and another, called “I Want To Learn in Context,” on virtual reality.
But our primary motives were to spend time with friends in the industry, meet new friends, and listen to everyone else’s talks — or at least as many as we could fit in. Also: deep dish pizza.
As usual, the HFES Healthcare Symposium was a fun and enlightening conference that manages to feel small, despite drawing its largest attendance ever this year. Bresslergroup thanks all the organizers, sponsors, and attendees for making this event a pleasure to attend.
All the tracks, even the ones not specifically geared toward medical devices, were relevant and interesting to us and our work in the healthcare industry. This year showcased some great design thinking and creative uses of technology in the medical space.
When we got back to our desks in Philadelphia, we realized we all still have a lot of ideas and inspiration rattling around in our heads after the symposium, so we got together to share them with each other, and now we’re sharing with you!
Our top thoughts coming away from the 2019 HFES Healthcare Symposium:
Virtual Reality Applications in User Research
Are there real, pragmatic applications for virtual reality in medical device human factors work? Definitely!
We loved hearing about applications of VR that come at human factors from the design input end, but we find it especially useful as a tool to share research insights. Alex and Conall spoke at the symposium about an effective and efficient way to use VR to add value to research data and reports, especially as the technology becomes more accessible and affordable.
It was clear from the conference that human factors professionals are searching for creative applications for VR. While our own VR toolkit leans more toward cheap-and-cheerful, Dave and Larry from 219 Design gave a demo of a design review in AR/VR that leveraged more complex equipment and showed how you can quickly gain design insights using VR.
Their demo showed how we can give a better sense of scale, size, and complex design elements (like LED light patterns) by placing a model inside the manufactured space where it will ultimately live. We look forward to seeing how our two approaches might be combined to do design reviews within a more realistic real-world context (captured by a 360 camera).
It seems like VR can be especially useful during rapid insight testing or formative testing, because we can create a virtual design prototype and have real users interact with it in a virtual space to gather feedback. A VR prototype enables researchers to make adjustments more quickly, allowing for a more agile approach to upfront research explorations. Rather than waiting for a physical prototype to be made, we can digitally mock one up while providing a real-life context in a virtual space.
Hospitals and Human Factors: Perfect Together?
While many hospitals are starting to form human factors-aligned groups and there's a strong pull from the medical side to include human factors, institutional factors can often confound the best intentions. Because of this, the consensus seems to be that progress will continue to happen in spurts, as isolated initiatives rather than a steady upward climb, until serious momentum is gained.
But evidence of a slowly-changing mindset can already be seen in how hospitals are communicating about clinicians and patients as partners who work together to co-produce good health outcomes. This paradigm shift from paternalistic to partner-oriented is leading to data-sharing innovations such as patient-owned EMRs (electronic medical records).
The value of opening a dialogue between patients and clinicians was discovered inadvertently when patients in one hospital were given a “urine color comparison” bracelet to make it easier for them to describe their bathroom visits in a consistent way to their nurse (e.g., “I went 20 minutes ago and the color was, like, number 5 on this bracelet”).
While the color reporting was effective, the real value came from giving patients an appropriately simple tool to start a dialogue with hospital staff about their bathroom use. Although the bracelets focused on urine color, once they were implemented the hospital saw roughly twice as many bathroom visits reported. It's unlikely the patients suddenly started urinating twice as often; more likely, creating an opportunity to talk made it more likely a visit got tracked at all. The data itself was not as important as the conversations it sparked between patients and nurses.
On the flip side, human factors professionals are recognizing the cultural inertia of physicians and getting more savvy at integrating human factors processes into the ecosystem of hospitals, doctors, and nurses. At Bresslergroup we’ve found a couple of ways to do that — by hiring a doctor, and by forming a partnership with Thomas Jefferson University, one of our local hospital systems.
Mitigating Alarm Fatigue with Sensory Experiences
Several presenters shared study findings about cognitive overload and ways of dealing with it. The Audible Alarms track was particularly interesting since alarm fatigue is a large issue in healthcare: when many alarms sound frequently and at similar pitches, they soon become background noise. There are regulatory documents that suggest ways of designing alarm tones to alleviate this, but those guidelines continue to evolve as new issues surface.
The first presenter shared a study her team did on identifying where alarm fatigue comes from. Their main finding was a direct link between "alarm fatigue" and general fatigue: as medical professionals (doctors and nurses) grow more tired on the job, they become less likely to hear alarms and respond appropriately.
Part of the problem with alarm fatigue is that alarms get masked by other alarms. Matthew Bolton shared his findings from a study on testing simultaneous alarms. Three alarms were sounded in sequence (see the diagram, below, from Bolton's presentation). Alarm 2, in the middle, gets masked by the two alarms around it. Participants could tell something was different when an alarm was being masked, but couldn't identify which alarm it was or what action it was prompting them to take.
A few presenters discussed a way of designing audible alarms that uses the concept of auditory icons. Just as visual icons represent actions (e.g. a house represents navigating to the homepage), auditory icons follow the sound pattern or tone of the thing they’re alerting about (e.g. a lub dub for a heart rate that speeds up as heart rate increases).
There's also the quite common situation where clinicians find themselves having to take their eyes off a patient to look at the stats on a screen. One presenter, Joe Schlesinger, discussed the possible solution of using haptic feedback for alerts rather than auditory or visual alarms. Haptic alarms would use tactile patterns to prompt actions tied to individual alarms (e.g., distinct vibration patterns for different alarms, with intensity that rises or falls as the patient's status changes).
UX and Agile in Medical Device Design: Not There Yet
It's clear that medical device design is still heavily focused on safety and efficacy, which are absolutely critical. But we were a little surprised not to hear more about UX (user experience design) in healthcare. Though the acronym "UX" showed up in the program, it was typically only in the context of digital experiences. The discussion for physical devices was still mostly about safety and efficacy.
We anticipate that as medical devices increasingly move out of hospitals and doctors' offices and into people's homes, safety and efficacy will continue to be just as necessary, but no longer sufficient on their own.
Safety and efficacy are must-haves to get to market — but even the FDA acknowledges that once those needs are met, experiential outcomes like confidence and empowerment are what will help one device stand out from others like it. We were happy to see Rachel Aronchick and Erin Davis from Emergo by UL begin this thread as it relates to separating UI elements from HF elements in validation work. And we anticipate these conversations will go much further in the future.
In future symposiums, we expect the discussion of UX to go beyond safety and efficacy and into understanding the fear and anxiety that comes along with using certain medical devices and leveraging design to mitigate those feelings. We’re thinking specifically about one current project — a product we’re designing for teens to feel independent and empowered to administer an awkward treatment on their own without having to ask a caretaker. Maybe we’ll present it at a future symposium.
Another UX-related theme we anticipate seeing more of is agile approaches to medical device design. We loved the poster by James Parker and Ambika Chou, also from Emergo by UL. Because digital design is so fluid and physical design is fairly stepped (or waterfall), we anticipate this debate will intensify as more digital-physical products show up in the medical device industry. Balancing the two approaches and aligning them along FDA checkpoints will be critical.
Similarities Between Fractures and Human Factors
At the beginning of the symposium, keynote lecturer Richard Cook gave a fascinating talk on the resilience of bones. What struck us most were the parallels he drew between fractures and human factors.
We take it for granted, but it's remarkable that our bodies heal themselves. Even after 30,000 years, humans are still simply pushing broken bones back into position to help them repair themselves. As we begin to learn what triggers the healing process, this intervention is evolving: we're exploring the notion of sending chemical signals to the bones to help them repair faster. This is a huge leap.
Cook likened human factors to that first level of bone repair — right now we’re more reactive than anything. We respond to incidents and try to reposition the system so it lands in a good spot. He sees a similar evolution for human factors where we start to mess with signals.
We’re excited by the parallel and by the possibilities!