July 08, 2015 — Blog Post
Catch My Drift? Helping Patients Understand Risk
Understanding and communicating risk is notoriously hard. Most of us aren’t statisticians, and even if we were, we’d be human statisticians. We’re all subject to subtle cognitive biases that influence our judgment, along with all the emotions that accompany a health condition or procedure.
Fuzzy trace theory
If you’re involved in communicating about risk, I encourage you to check out “fuzzy trace theory” — and not only because it’s a fantastic name for a theory. (You can find an overview here, while this paper applies it to cancer decision-making.)
The idea behind fuzzy trace theory is that we remember and act on the gist of information, not the verbatim details we’re presented with.
Here’s an example. Have you ever done a risk assessment for something like heart disease or breast cancer? Tell me: what was your risk? Do you recall it being “really low,” “higher than average,” or “7.8%”? Many of you probably remembered a category of risk, not a precise number. That’s pretty common. We prefer simple and familiar information (like categories). Of course, we can and do evaluate more detailed information when needed, but it’s not where our minds immediately go or what we tend to act upon.
Numbers alone don’t cut it
Importantly, just because you have numbers in your head doesn’t mean you’ve interpreted them correctly: you might still be missing the gist. Take providing informed consent for a 2% risk of death. The gist of this, of course, is that you could die. So if you hear the risk as 2% and disregard it as “not gonna happen,” you’ve missed the gist of that number. On the other hand, if you incorrectly remember the risk as 10%, at least you correctly got the gist that, “hey, this is an actual risk I need to consider.”
Fuzzy trace theory also suggests that numbers aren’t as objective as they seem. We project experiential and emotional meaning onto them. If the forecast says there’s a 20% chance of rain, I’ll happily leave my umbrella at home. But if a surgeon tells me there’s a 20% chance of permanent nerve damage, I’m probably not getting any sleep that night.
What patients are hearing
This isn’t to say we should abandon attempts to communicate exact risk information, when appropriate. But we should keep in mind what patients are hearing. Better yet, rather than making assumptions about how they’re interpreting things, just ask. Think of it as “teach-back” for risk communication.
Try something like, “I’ve gone over a lot with you. What do you make of it?” See what they throw back at you. How are they filtering the info? What are they focusing on?
By listening for their gist, you can help them understand yours.