Expertise Alone Isn't Trustworthy
The US public consistently lacks high levels of trust in scientists. Scientists need to realize that science alone doesn't automatically yield public trust.
Do you “trust in science”? What does that question even mean?
I love the way Pew Research Center asks the question in their occasional “trust in groups and institutions” survey:
“How much confidence do you have in scientists to act in the best interests of the public — a great deal, a fair amount, or not too much/no confidence at all?”
I love that way of asking the question because it's not the way science understands trust in science, which goes: we trust the consistency and objectivity of science's methods, that science will always be science, and that science being science is always in the best interests of the public.
Pew understands that “trust” for the public is something very different: Yes, great, you’re internally consistent. But can you show us you’re acting in all of our best interests?
In today's low-trust society, aiming to cultivate just the first kind of trust seems…well, not very trustworthy. Pew's data bear this out: Eighteen months ago, in its last "trust in groups and institutions" survey of the US public, Pew reported that only 29% of Americans had "a great deal of confidence" that scientists act in the best interests of the public, down from 39% at two separate points in 2020. The "great deal of confidence" number has fluctuated for scientists since Pew started the survey in 2016, but in every wave, somewhere between 55% and 79% of the Americans Pew surveyed have lacked "a great deal of confidence" in scientists.
When it comes to numbers like these, science tends to skirt responsibility and blame its enemies. Well, misinformation and disinformation. And those low levels of public scientific literacy and critical thinking. Don’t forget politicization of science. There’s really nothing we can do about these algorithms and crazy politicians and poorly educated people. (I just finished reading a Nature article about science’s role in disinformation in which the author — a scientist — argued in successive paragraphs that we should teach everyone to be Bayesians and then, oh never mind, “as a species, humans have always been shockingly biased and gullible.”)
These are formidable challenges. But here's another way to look at the problem: Post-pandemic, science has a lot of work to do on trust, once we accept that "public trust" means "the public trusts us to act in its best interests." Science can no longer claim trustworthiness simply because it's expert, or expertly produced. Science is failing to acknowledge that _trust_ in our low-trust society is a relationship to be maintained, not a quality to be assumed. Insisting that "greater understanding of the sound processes of scientific knowledge production will lead you to trust science" is rather like your mail carrier lecturing you about the wonder of zip codes when that check is still in the mail six weeks after you expected it.
The trust economy today is focused on individual brands, not organizations or institutions. Trust in expertise today is trust in the expert. And effective public experts don't just have research credibility; they also have trustworthiness, a quality that synthesizes the abilities essential to cultivating public trust:
Translating their expertise into useful information for non-experts;
Applying that expertise to problems non-experts care about; and
Building credibility with the communities they wish to help.
Hypothesis: Public Experts Are Research Credibility + Trustworthiness
I gave a talk last week at the Urban Institute (many thanks to Dave Connell and Amy Elsbree of Urban for inviting me) in which I introduced the 2x2 below, which attempts to illustrate that trustworthiness matters just as much as research credibility for the public expert (the expert who's consistently effective at translating their expertise into terms both useful and compelling for a non-expert audience):
Along the x-axis: Your research credibility. Along the y-axis: Your trustworthiness — a function of a) your ability to translate your expertise and apply it profitably to problems and questions in the real world, b) how much your audience cares about what you’re talking about, and c) your credibility with the communities you want to engage and influence. None of the components of trustworthiness are functions of your research; they’re instead functions of your ability and willingness to be a bridge between your expertise and the desire of a community to address the issues it faces.
Experts, located in the bottom right quadrant, have high research credibility but low trustworthiness. That's not because they aren't trustworthy as experts; it's because trustworthiness is no longer a function of the level of expertise you possess. You become trustworthy by cultivating how your expertise serves a community, which means trustworthiness is a value created by you and that community together, not by you alone.
Public experts, who combine high research credibility and high trustworthiness in the top right quadrant, by definition must have a strong connection with a community in order to have credibility with it. Expert communicators, in the top left quadrant of the 2x2, can actually cultivate high levels of trustworthiness with communities without correspondingly high levels of research credibility. (When I label this quadrant "expert communicators," I'm referring to research communication experts as well as excellent communicators who have little or no grounding in evidence. When Tim Ferriss, whom I would put in this upper left quadrant, tells his audience something patently false, most of that audience still trusts him because he embodies the markers of trustworthiness for them.)
The 2x2 is meant to be descriptive, not scientific. And I'm eager to hear from you how I might improve it. But I think it explains the 2020 to 2021 drop in trust in scientific expertise quite well: while expertise didn't change during the pandemic (and in fact was communicated incessantly), the translational ability and (especially) credibility of certain communities of experts (public health and otherwise) fell dramatically with certain public communities. If you simply point to pandemic misinformation as the cause of the drop, you've also got to explain why overall confidence in scientists was never high to begin with.
Meanwhile, you've also got to explain why on Earth you would still rely on your expertise alone to win an audience in the low-trust environment we live in, instead of also building your own trustworthiness.