Trust, Pseudoscience & the Public Expert
Science is forever worrying and declaring that people are losing trust in it, even though levels of public trust (and lack thereof) in scientists and scientific institutions have remained remarkably steady since the early 1970s, at least in the United States. And this information is available to all the worriers/declarers.
So the worry is about something else — fear about a loss of support, most likely, and the vulnerability science feels in being dependent on societies that don’t understand it. At least, that’s the context in which I think we should situate the findings of this new study in the Journal of Experimental Social Psychology, which found (generating much media coverage last week) that people who "trust" science are more vulnerable to belief in pseudoscience that is dressed up like science.
Blind worship of science is as unreliable a social foundation as distrust. The study’s authors admitted to Nieman Lab’s Denise-Marie Ordway that it’s hard for many in the public to grasp the complicated science behind vaccines or GMOs or 5G. So they suggested instead that people could “develop a type of scientific literacy known as methodological literacy” — an understanding of scientific methods and research designs, which could help them better distinguish between science and pseudoscience. Two of the authors also recommended that journalists model methodological literacy in their science coverage to help cultivate a more critical approach. (Amusingly, Ordway’s Nieman Lab is the only instance I could find of the study itself actually being covered critically — the rest of the coverage all appeared to rip or paraphrase from the same press release.)
This suggestion — that people are the problem, and they should become more methodologically literate — reminds me of one solution to the Illusory Truth Effect. (ICYMI: The Illusory Truth Effect holds that the mere repetition of even obviously false statements increases their perceived truth, even among people who know them to be false. According to one new study, it kicks in with just five repetitions of false statements such as “the Earth is a perfect square” and “elephants run faster than cheetahs.”) One paper tells us we can overcome the Illusory Truth Effect by researching the accuracy of new information before we encounter it a second time.
Never mind the time — who has the wherewithal to become a just-in-time semi-professional “fact checker”? The problem with most solutions to pseudoscience is that, far from encouraging more critical thinking, they put the onus on ordinary people to become...pseudoscientists.
Which brings us back to “trust.” I find it telling how pejoratively it was defined in the coverage last week — akin to “blind faith.” The assumption is that we can somehow operate free of trust in scientists while relying on our own application of scientific methodology. But of course we must trust scientists, as we must trust any specialist whose services and counsel we rely on, in order to live productively. Even scientists have to trust other scientists, because, as the Harvard infectious disease epidemiologist Marc Lipsitch put it in this thread, “having a truly informed opinion on most matters is beyond our time constraints.”
Takeaway: As a strategic advisor to public experts, I don’t find most discussions around “trust in science” very useful. That’s because the public’s trust in science doesn’t overlap terribly much with a particular community’s trust in you as a public expert. “Science” already has “the public’s trust”: That's table stakes. Your work as a public expert — to find your particular communities, figure out what their heuristics of public expertise are, and occupy those heuristics as you cultivate their trust — is just beginning.