CFP: The democratic containment of fake news and bad beliefs

Submission deadline: July 15, 2023

Conference date(s):
October 26, 2023 - October 27, 2023


Conference Venue:

Luiss Guido Carli University (Rome)
Rome, Italy


Call for Abstracts

The democratic containment of fake news and bad beliefs

26th and 27th October 2023

Luiss University Rome

Keynote Speakers:

Neil Levy (University of Oxford)

Cathrine Holst (University of Oslo)

Democratic theory typically assumes that citizens are rational agents. However, there is a growing philosophical literature on the cognitive failures of the democratic public. Citizens may fail to identify reliable sources of information and fall prey to misinformation. This phenomenon has become especially evident over the last few years, as “fake news” has gone viral on social media. Moreover, citizens holding “bad beliefs” may resist fact-checking and debunking, retaining beliefs that conflict with the available evidence. They may deny credibility to scientific expertise and scientific evidence; a clear example is the persistent denial of anthropogenic climate change.

To tackle the spread of misinformation in liberal democracies, several countermeasures have been invoked and increasingly applied, including institutional speech restrictions and self-regulation by social media platforms. Such responses risk undermining citizens’ liberty and autonomy and being unduly paternalistic. Countermeasures designed to ensure competent democratic deliberation may raise similar worries about paternalism. Aside from those involving electoral reforms towards epistocracy, possible countermeasures may attribute increasing weight to experts in policymaking. Indeed, expert advisory boards may counterbalance failures in knowledge formation and reasoning among the electorate. Yet, despite their scientific competence, experts themselves are not completely immune to cognitive traps and biases. Thus, besides legitimacy worries, the role of experts in policymaking may also raise epistemic worries.

Epistemic failures among citizens of liberal democracies need to be tackled. However, normative theorists should provide guidance to ensure that policy responses are compatible with the basic values of liberal democracies themselves, namely freedom, autonomy, and equality.

We welcome submissions relevant to this research aim, broadly construed. Possible topics include, but are not limited to, the following questions:

  • What normative issues are raised by misinformation and cognitive distortions for democratic theory?
  • What are the policy implications of ascribing fake news, denialism and cognitive failures to motivated reasoning or the epistemic environment rather than irrationality?
  • Is fake news different from previous forms of misinformation when it comes to its effects on the agency of citizens?
  • Does vulnerability to fake news and cognitive distortions correlate with inequalities, including epistemic inequalities?
  • Does fake news have different impacts on different social groups?
  • How should fake news, climate change denialism and other cognitive failures of the democratic public be tackled?
  • How far are the various normative solutions that have been proposed compatible with the basic liberal democratic values of freedom, autonomy and equality?
  • Should liberal democratic governments rely more on experts’ advice to counterbalance misinformation, persistent denialism, self-deception and other cognitive failures among voters?
  • Which epistemic and moral worries do non-elected expert advisory bodies raise?
  • Which institutional measures can be taken to ensure the accountability of experts in policymaking?

Submission Details:

To apply for the conference, send an abstract prepared for blind review (max 500 words) to [email protected] by 15/07/2023. Notification of acceptance will be sent by 30/07/2023.


This conference is part of the research project: Deceit and Self-Deception. How We Should Address Fake News and Other Cognitive Failures of the Democratic Public.
