What happens when rational coherence is not assumed, in the development of creationist views?
No child can make their epistemic set maximally coherent, and so they will likely acquire a number of mutually inconsistent epistemic values and principles. If your parent tells you, on the one hand, to try things out and see if they work, and on the other that you need do nothing but believe the pastor or the Bible, this does not register for most young children as a conflict.
Young learners are natively active explorers and experimenters to some degree, but this does not immediately make their endeavors scientific. Being scientific requires more than a trial-and-error approach; it must also exclude non-scientific commitments. So as they develop, learners can acquire distinct sets of views. At some point, what has been called “cognitive dissonance” – the internal conflict between two divergent cognitions (Festinger 1957) – may force the learner to fall in more completely with one set or the other, depending on which is more strongly embedded. This is akin to a “conversion experience” in religion: most of those who convert do so because the offered choice more closely matches one set of prior, developmentally acquired epistemic commitments, forcing the revision or elimination of much of the other set.
The learner in this case is only partially rational, in that their belief sets can contain some degree of mutual contradiction. Under the bounded rationality view, something like this is all but inevitable: if we do not have time to consider our epistemic commitments without bound, then neither do we have time to work through all the ramifications of a largish set of views. To represent this, let us return to the first figure, now amended to indicate actual, rather than potential, epistemic development in a single learner.
Figure 2: When cognitive dissonance exceeds the individual learner’s tolerance limit, then the least deeply embedded of the divergent epistemic sets will be extinguished, forcing a radical revision of prior solutions.
In this case our learner must choose between two mutually exclusive rational sets of epistemic principles and cognitive items. In terms of bounded rationality, it is more likely that the earlier views will be more heavily weighted, but as noted above this will also depend upon the force of the social cues on which that part of the trajectory is based. If more people tell you to rely on the Bible than on the evidence of observation and experiment, then even if you were exposed early on to empirical reasoning and experiment, the Bible may yet outweigh experience.
So, crucially, a rational choice depends on what sort of exposure one has had to scientific matters. I think that direct experience, in which the learner learns by their own activities of experiment and observation, is sufficient to overcome the social cues of authorities. Likewise, inductively similar results from others that closely match the learner’s experience may be enough to overcome authorities. For example, if as a high school junior I do an experiment in the laboratory in which some reaction occurs, a report of other reactions sufficiently similar to the one[s] I did is enough to convince me of their veracity, no matter what the local alchemists may tell me.
All of this is based on the idealization that we are dealing solely with boundedly rational agents with limited resources and time. Of course, this is not true even of the most “rational” of people – humans have a predisposition to see things in anthropocentric terms, and individuals have psychological dispositions of their own. But it does appear to me that we can explain a great deal about why people become anti-science if we take their conceptual development to be a process of rational inference. Such deviations as do occur must therefore be imputed to other aspects of human cognition, in particular social-psychological effects such as primacy (forming judgments based on the first information encountered), availability (how easily something can be imagined or called to mind), and various forms of consensus enforcement such as groupthink.
In the next post, I will suggest what this might mean for educational and public policy strategies for responding to bad science and anti-science.
Festinger, Leon (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press.