Home
People often make suboptimal judgments and decisions. Why? Is it because we go with our gut feelings and don't think hard enough about our options? Is it because we are biased by our feelings and emotions? Is it because we aren't adequately informed, or lack the skills (e.g. literacy, numeracy) needed to evaluate critical information?
Research conducted in the Attitude and Decision Making Lab addresses each of these questions. Our focus on health decision making highlights the broader societal impact of these psychological processes and their implications for the healthcare system. This research identifies and addresses decision making biases that contribute to poor health outcomes and wasted healthcare resources.
Read more about research conducted at the AADM Lab:
Director: Dr. Laura D. Scherer
I received my PhD in 2010 from Washington University in St. Louis, where I studied automatic biases and implicit evaluations. The year 2009 proved pivotal for my research and interests; this was the year that controversy ignited in reaction to a revised USPSTF recommendation to reduce breast cancer screening among average-risk women aged 40-49. The recommendation was based on evidence showing that screening can do more harm than good among low-to-average-risk women, and yet it sparked heated public backlash. It occurred to me that the psychological phenomena I was studying—intuitive judgments, emotions, judgment bias—were likely contributing to the public's resistance to those recommendations. During a postdoctoral fellowship at the University of Michigan Center for Bioethics and Social Sciences in Medicine and the Ann Arbor VA hospital, I explored how affective and cognitive processes contribute to the overuse of low-value healthcare and the rejection of high-value care.
This research has shown, for example, that people often want medical care (e.g. cancer screening or treatment) even when it is likely to do more harm than good. This "action bias" might seem to be caused by people's faulty intuitions and tendency to "go with their gut," but my research found that when people are forced to think deeply about these decisions (i.e. not go with their gut), they make the same poor decisions with even greater confidence. More important for predicting medical decisions are things like diagnostic labels, which can lead people to desire treatment even when they have been told that the treatment is not beneficial. We have also found that people differ in their general approach to medicine: some people are "medical maximizers" who prefer to receive medical interventions even when they are unnecessary, whereas other people are "medical minimizers" who try to avoid medical interventions whenever possible.
Today, research conducted in the AADM Lab continues to identify biases in medical decisions, as well as ways to address those biases. We also examine other kinds of judgments and decisions; for example, we are currently testing theory-driven interventions to try to reduce bias in political judgments. Perhaps most exciting, and sparked by our findings from the medical domain, we are conducting an extensive line of research to determine whether judgment errors that are typically thought to be "intuitive" (e.g. conjunction errors, responses on the Cognitive Reflection Test) are best understood as the result of reliance on intuition or of a lack of cognitive skill (e.g. low numeracy, an inability to avoid confirmation bias). This research speaks to the large and ever-growing literature on dual-process theories of social judgment.