I recently stumbled upon a pair of complementary and contrasting articles which at first glance may appear to present opposing viewpoints. "Democracy is the best thing ever and we need more of it to have better science education !", says one. "No, no, democracy is the last thing you want in science at all !" says the other.
Actually, this is a gross oversimplification. Neither article really says that. Still, there does appear to be an inherent tension which goes to the heart of the democratic process : we need to hear different viewpoints in order to reach the best decision, but the truth itself is independent of our opinions.
The first article is from the Philosopher's Beard blog, and I think it does a nice job of explaining why democracy is not a guide to truth - all while maintaining that we need democracy as a decision-making process :
Rational truths are those established by chains of human reasoning that can in principle be replicated by others. Science is an archetypal form of rational truth seeking since its authority depends on such replication: that one will always get the same result in the same experiment because the result doesn't depend on who the scientists are, but on the independently existing world.
Note, however, that rational truth does not work by persuasion, as opinions do in a democracy. Instead it rudely asserts that this is how things are whether you like it or not. The standing invitation to replicate the results of an experiment is not an invitation to have your own opinion about whether they are true.
To put it in a nutshell, it is right that in a democracy the people can debate and vote to decide whether murderers should be executed, but it is misguided to think the democratic process can also decide the factual issue of whether a particular person is a murderer. Democracy can determine whether to teach Intelligent Design in schools, but not whether it is true.
In essence there are different kinds of beliefs. There are those founded on scientific principles and objective evidence, and there are other sorts. Crucially, the other sorts are not invalid. This depends on context : you can objectively measure electrical charge and CO2 emissions, but you can't quantify anger or disgust. You can scientifically assess what will happen to person X if you apply action Y to them, but you cannot possibly scientifically establish whether that's a good thing or not.
This means that whether you want to use a democratic or scientific approach depends on what you're trying to accomplish, what sort of "knowledge" (for lack of a better word) you're trying to get at. If you're trying to establish knowledge about the world, go for science - compare your theories with measurable evidence, try more theories, repeat ad nauseam. Taking a vote makes exactly no difference. But if you're passing judgement on what policy to implement, if you're trying to juggle conflicting opinions on issues which can't be quantified, then yeah, a vote makes a good deal of sense.
Start applying democracy to science and things rapidly go very far south :
What actually results when [lay]people with different opinions on the science of climate change debate it in public is an increase in confusion, not enlightenment. The problem stems from an underlying confusion between the generally adequate liberal presumption of equal intellectual capacities (hence, freedom of ethical, religious, and political choice) and the faulty assumption that therefore everyone has the appropriate intellectual capabilities to assess particular truth claims.
To actually assess the truth of the matter - to learn from the debate in the Millian sense - presupposes that we have developed the capabilities to assess the truth of specific scientific claims in the first place. And that seems to require that we be scientists, and in fact specialists on the topics concerned, rather than ordinary citizens.
Not everyone can be an expert. Specialisation requires a great deal of time and effort, which is why we have specialists in the first place. Supposing you can skip all of this is an act of monumental hubris.
How are you as a non-scientist supposed to assess the scientific truth and significance of that [specialist] claim? Or of the mainstream scientific consensus ? If you do have an opinion on the matter, in what sense do you think it counts as knowledge ? If you don't believe what the climate scientists say, what makes you think you know better ? If you do believe them, do you consider that your belief in climate change is as well justified by an understanding of the relevant evidence as theirs ?
Just treating claims about the truth as contributions to the democratic market for ideas in the first place distorts their character and assessment. It suggests that we should treat such claims as opinions, and engage in constructive mutually respectful debate about them, as if they were of the same kind as other people's opinions about immigration reform or the Republican presidential nomination. Counting votes, though it may respect the equal dignity of every citizen, is irrelevant to the truth of scientific claims. Even if everyone based their vote on thorough internet research, a majority view that global warming is true has no more epistemic credibility than a majority view that it is false.
Since most people are manifestly unable to evaluate the science, we reach for non-science related grounds for judgement that we do feel proficient at... argument; the speaker's personal credibility; and emotional appeal. So it really matters how trustworthy we think the speakers are. We may not understand what they're saying, but if we think they're not being honest, we don't have to.
All this very much mirrors my own stance on science versus religion. I find it very strange that so many people on the internet insist that all religious people believe their holy texts are literally true. As far as I can tell, such critics are happy enough reading fiction and getting their own interpretation of it; more than a few are even dedicated poets. So if you accept that you can have important truths embedded in outright fiction, why can't this be the case for religious texts too ?
Hint : it can. Mistakes are made both when the religious take their texts as literal truths and when the non-religious insist that this is happening even when it isn't, or when they insist the moral message must be what they think it is and that no-one else has any right to any other interpretation (the popular but weird argument that you can't pick and choose, even though a work rife with contradiction, ambiguity and symbolism positively demands that the reader does so). Not for the first or last time will I say that interpretation is, unless the text is absolutely explicit, subjective.
So science provides us with facts and theories about how the world works, whereas other systems of thought inform us as to whether and when this is a good thing or a bad thing. Likewise :
Democratic debates are limited to deciding what we should do as a society, not what individual members of a society should think. Thus, unlike in actual scientific debate, losing a political argument does not mean accepting that you are wrong, only that you haven't (yet) managed to persuade enough people to your view.
We can have no faith that the popularity of certain factual claims among people as ordinary as ourselves is any guide to their truth. Democracy is no more equipped to evaluate facts than rational truths... when it comes to the facts, neither the sincerity with which individuals believe that the holocaust is a myth nor the popularity of such beliefs can make them epistemically respectable. 90% of the population denying the holocaust is irrelevant to its truth status. And vice versa.
I would interject that science and democracy are not so well-separated as science and religion, however. There are elements of the democratic process to be found within science, especially at the messy forefront of research : having different ideas about what the data means, discussions on which model seems more plausible, knowing that the selection of facts (which you choose as relevant and which you disregard) is subjective... this is vital. And even Plato conceded that dialogue and debate were absolutely essential to the search for truth :
Only when all of these things — names, definitions, and visual and other perceptions — have been rubbed against one another and tested, pupil and teacher asking and answering questions in good will and without envy — only then, when reason and knowledge are at the very extremity of human effort, can they illuminate the nature of any object.
In this sense, of contesting and discussing different ideas and contrasting alternatives, science has a democratic aspect to it. And of course, we do use actual votes in research councils and the like to decide on which projects should be awarded funding and given access to expensive facilities. With limited resources, there's no other option.
But the strength of a scientific consensus, the most prevalent opinion among experts as to what is true, is greatest when it emerges naturally from many independent and competing voices. Collaboration is vital, but in the sense that we need to consider different ideas, so too is competition. Paradoxically, while you can gauge the strength of a belief by its prevalence in the scientific community, that's exactly what scientists have to strive to avoid when individually deciding what's true (or more probable). Science is hardly divorced from debate, but the procedure is very different from the usual democratic process.
In other words, there's a world of difference between holding an opinion poll among experts to gauge what the existing consensus is (if there is any), and holding a vote to set what it should be. Both involve counting, but that's where the similarity ends.
It's also a mistake to conflate "voting decision" with "correct decision" even in political arenas. You can vote for climate change to not exist, but that does not make it so. You can vote to leave the EU, and it will happen, but that doesn't make it right. And yet, as the article goes on to say, science and democracy form an essential partnership :
Truth and democracy are in tension, but nevertheless truth and democracy do belong together. Good public deliberation... requires a foundation of trustworthy true knowledge upon which we the people can construct sensible opinions of our own. Democracies thus require basic commonly held knowledge and on-demand access to trustworthy specific knowledge about how the world works.
It is for this reason that successful democracy requires setting up and protecting independent and non-democratic spaces and institutions - specialised epistemic communities with the authority to investigate truth. These are the real truth machines that are supposed to burrow after the truth wherever it may take them, and then report their findings back to the rest of us, who get to decide what to make of it.
The central problem for these truth machines is that they will always be in tension with the democratic values and politics of the society that set them up. Institutions like universities and courts are deeply anti-democratic and anti-individualistic. And that is how they are supposed to be. They try by various institutionalised organisational structures, values, and discipline-specific methodologies to assess ideas on the basis of their objective truth without regard to how agreeable they are to few or many people. They represent both a massive contradiction to and a necessary foundation for our shared liberal commitments about the pre-eminence of individual judgement, respect for the opinions of all, and a co-operatively determined political rule.
Not surprisingly they are in fact extremely vulnerable to political pressure because "the truth" has no intrinsic power to triumph in a democracy. Quite the reverse. Setting them up and maintaining them requires a self-binding political commitment from society, a collective agreement to place them outside the sphere of political contestation.
All of this means that democratic polities are in the uncomfortable position of voluntarily giving up their authority to decide what truth is, of setting up and actively supporting somewhat unaccountable truth machines that then proceed to tell us all sorts of things about ourselves and the world that we would rather not believe.
I like this very much. I think it's stupid to invoke the "democratisation of knowledge" as though letting everyone vote on everything will magically make us all experts. Expertise is acquired, not bestowed - you don't vote for surgeons and electricians. This is true for some non-scientific institutions as well : getting politicians to appoint judges is completely baffling to me. But, at the same time, to say that "democracy" is therefore useless is also stupid. Democracy is an essential part of public life, but so are non-democratic elements. The key is to blend these systems together into a harmonious whole, neither letting expertise degenerate into authoritarian diktat, nor letting plebiscite override or infringe upon expertise.
Demarcating the limits and domains of each isn't always easy; the lines between "objective fact", which needs to be assessed through evidence, and "resulting policy" can be all too blurred. We've seen this repeatedly throughout the pandemic, with the phrase "follow the science" being raised like a talisman by people with very different ideas about what the science even says at all, let alone what it implies you should do about it. When you have to choose between awful options, it isn't always self-evident which one is preferable. Again, science can at best only inform you as to what each option will result in, not the morality of each choice. And pretending that morality is self-evident is dangerous in the extreme.
But it is at least possible in principle to establish these boundaries. Key to this, as the article says, is that we need to know the undemocratic elements are trustworthy. So how do we ensure this ? Here I think the second article from the Boston Review does a better job.
As the blog notes, having a common set of facts is essential : science itself is not democratic, but its findings must be open to all. But the review article goes a lot deeper. It begins with a lengthy rebuttal of the claim that the world would be a better place if only the darned public would just pay attention to scientists more : that is, that they make mistakes because they're ignorant of scientific findings and methodology alike. If only they knew more about climate change, the argument goes, they'd vote to take more decisive action on it.
The [knowledge-]deficit model has not fared well in the face of evidence over the last two decades. For starters, the approach simply does not reliably deliver the expected results of wider support and acceptance of science among the public. Despite concentrated efforts in science education and dissemination, periodic surveys on the public’s understanding of and attitudes toward science both in the United States and in the UK indicate little to no change in scientific literacy over the years. With respect to vaccines, in particular, interventions based on providing scientific evidence refuting vaccination myths have largely proven ineffective.
It's important to note at this juncture, though, that experts remain highly trusted, although this varies significantly. This alone could undermine the article's claim that science needs to be more democratic. Initially, as I read, I was skeptical of this for precisely the reasons given above - science requires specialisation, which is inherently undemocratic. But one cannot draw black and white distinctions here. As I said, science does have the democratic aspects of discussion and debate. And as I've said previously, I think there should be an in-depth examination of how the anti-vaccination movement began in Britain but failed there, yet succeeded elsewhere.
What the review posits is that there's a trust deficit which is more important than the knowledge deficit. Of course, a knowledge deficit will always exist by definition - again, that's what specialisation means.
This “trust deficit” is not primarily fuelled by an epistemic concern — the perceived incompetence of scientists, say. Rather, public distrust is often animated by concerns over spurious interests — above all, monetary or political incentives that are perceived to compromise the reliability or legitimacy of scientific knowledge claims... while segments of the public may concede the competence of scientific experts — having the appropriate level of knowledge and skills related to a certain scientific area — they may simultaneously doubt their benevolence.
Which is why I've been at pains to present scientists as being just generally normal people, even if I myself have been likened to Ponder Stibbons by other scientists. But the above links back to the previous discussion : accepting something as true does not mean accepting it as a good or bad thing. And as the review makes clear, there are very good reasons for this indeed :
Multiple studies have shown how historical patterns of mistreatment — including medical experimentation without consent and exclusion of certain groups from clinical trials — help to explain why some communities remain deeply suspicious of scientific interventions.
Ultimately, trust entails vulnerability. If I trust you enough to let your input influence important decisions that I make about my life, I make myself vulnerable to you; I give you a certain amount of power over me. Health care decisions are especially risky in this regard, and the risk is only compounded for marginalized communities.
Certainly if you're on the sharp end of abominable practices like forced sterilisation, you have very good reasons indeed to distrust the medical community. It matters not one whit whether the researchers formulated correct conclusions based on their research if that research was done on you without your consent. Scientific knowledge itself has nothing much to do with opinions, but scientific practice definitely does.
All that said, the article misses the enormous caveat of POLITICS, which can transcend marginalised groups and make even majority groups unjustly distrustful of scientific findings. Again, the anti-vaccine movement needs to be examined : how did it succeed in aligning with US politics but fail in Britain ?
The review suggests that we need to move beyond the knowledge-deficit model and move from public education by scientists to public engagement with scientists :
One example of a framework for putting these ideas into practice is Horizon 2020, the European Union’s research and innovation funding program... which “requires all societal actors (researchers, citizens, policy makers, business, third sector organisations etc.) to work together during the whole research and innovation process.” In this scheme, science should be done with and for society; research and innovation should be the product of the joint efforts of scientists and citizens and should serve societal interests. To advance this goal, Horizon 2020 encouraged the adoption of dialogical engagement practices: those that establish two-way communication between experts and citizens at various stages of the scientific process (including in the design of scientific projects and planning of research priorities).
But implementing this isn't easy, because nobody really knows what it means.
Jack Stilgoe and colleagues likewise lament how the paradigm of public engagement has come to function as a “procedural” strategy to “gain trust for a predetermined approach,” leaving existing power structures intact. Taken together, this work suggests that the public engagement narrative has come to function more as “rhetoric” than reality.
I mean, what sort of dialogue are we supposed to have ? I see it as a basic necessity that the results of publicly-funded research should be made public, and ideally communicated as clearly and accessibly as possible through outreach. But I am not at all sure what more can be done : if you were able to judge what sort of inquiry I could conduct, you'd be doing the research yourself. Or more bluntly, I don't tell you how to do your job. Why would you tell me how to do mine ?
However...
As philosophers Pierluigi Barrotta and Eleonora Montuschi have argued, science should itself be responsive to society: adopting a synergistic approach that allows different people to contribute with their diverse experiences and bodies of local knowledge would make it possible to raise and address new significant research questions, gather relevant data, and attain new knowledge. In a similar vein, science and technology studies scholar Sheila Jasanoff recommends the adoption of "technologies of humility," whereby stronger citizen participation should improve science governance in terms of accountability.
As historian of science Naomi Oreskes puts it in reference to climate change, scientists have a “sentinel responsibility to alert society to threats about which ordinary people have no other way of knowing.” Engaging with and involving different segments of society is fundamental for achieving a better understanding of the challenges faced by societies and developing research that is sensitive to these challenges and thus able to serve societal needs.
I am all for this. It's crucial to remember that scientists are also members of society; if we visit ivory towers for the sake of doing research, we don't live in them. Living is done in the real world, in the home, in the pub, in the cinema, at concerts, in the park, on the street. You can take the scientist out of society, but only while he's at work - you can't take society out of the scientist.
I would be opposed to having non-scientists decide what I get to research. It just wouldn't make any sense. But there's absolutely nothing wrong with asking them. There's nothing wrong at all with consultation. There are various ways to do this, from one-on-one discussions to citizens' assemblies involved with research councils.
So no, democracy is not a truth machine, and science cannot be a democratic process. But while some forms of greater societal engagement would be damaging for science, like letting Joe Public decide who gets funding, it doesn't follow that the public must be excluded entirely. Done correctly, there are routes to engagement which would be enriching, not detrimental.