
Evolution Of Truth Tracking

Introduction

Could there be evolution of truth tracking? Nozick claims on several occasions that natural selection would favour his picture of knowledge as truth tracking.

The following two quotations are indicative of this element of Nozick’s position. They are taken from widely separated sections of his major work:

“evolution, which doth make trackers of us all, would select for global tracking rather than especially favouring the local version”

(R Nozick, Philosophical Explanations, Belknap Press of Harvard University Press, 1981, p. 194)

`there would be evolutionary selection for better capabilities to detect facts and to have true beliefs’

(ibid., p. 284)

The importance of this claim for Nozick rests on more than these remarks, however. It would be a defect of his account of knowledge if, on any view holding that possession of knowledge enhances evolutionary fitness, the picture he draws meant that we as evolved creatures could not reach knowledge, or even approach it.

At some level, this claim must be true. Any creature in possession of knowledge about benefits available in the environment, such as food sources, will outperform one lacking such knowledge. And any creature making dramatically inaccurate overestimates of its own capacities will suffer the consequences. However, this essay will argue, drawing primarily on McKay and Dennett, that the picture on which complete knowledge is indisputably good for fitness is too simplistic.

Is More Knowledge More Fit?

This leads to a potential challenge to Nozick’s position. Evolution will only make us trackers if tracking provides knowledge and knowledge adds fitness. If the second claim fails, then even if tracking provides knowledge, we may not do it. And if we do not do it, then either we need a different account of knowledge, or we accept that Nozick has described something we will struggle to reach and may not need to reach.

Nozick does have a defense, however. I will show that avoiding false positives and avoiding false negatives are of differing practical importance in different cases. Yet Nozick can concede this and ask what links practical importance to relevance for epistemic assessment. Since no such link can be provided, his account is in fact safe from attack from evolutionary perspectives. I also consider a variant challenge based on the interaction between evolution and distances in possible-world space; again, it seems that Nozick has an adequate response.

One final charge to press against Nozick is this: if in some cases false belief is better for us, then the best way to obtain it might be to avoid tracking, so that tracking itself could be unfit.

Evolution Of Truth Tracking: Nozick’s Account Of Knowledge

Nozick has a four-factor account of knowledge. It is an extension of the traditional tripartite model of justified true belief. Nozick’s account relies heavily on the subjunctive conditional. This refers to the connective in sentences like `if A were to be the case, then B would be the case’.

The following is the standard terminology.

  • p: the belief or proposition in question
  • s: the subject
  • B: two-place belief operator: the preceding subscript indicates the subject and the succeeding subscript the belief, so that `sBp’ abbreviates `s believes that p’
  • ☐→: subjunctive conditional connective
  • ¬: negation

Four Conditions For Knowledge

On this basis, Nozick’s four conditions for knowledge are as follows.

1. p
2. sBp
3. ¬ (1) ☐→ ¬ (2)
4. (1) ☐→ (2)
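Writing conditions (3) and (4) out with (1) and (2) expanded, using the notation defined above, gives the following (a plain restatement of the conditions, not an addition to Nozick’s account):

```latex
\begin{align*}
\text{(3) Sensitivity:}\quad & \neg p \;\Box\!\!\rightarrow\; \neg(sBp)\\
\text{(4) Adherence:}\quad   & p \;\Box\!\!\rightarrow\; sBp
\end{align*}
```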

Condition (3) is known as Sensitivity. If s satisfies it, his belief is sensitive to the falsity of p: in the close possible worlds where p is false, he no longer believes p.

Condition (4) is known as Adherence. If s satisfies it, his belief adheres to the truth of p: in the close possible worlds where p is true, he continues to believe p.

The possible worlds analysis is primarily due to Lewis (D Lewis, Counterfactuals, Blackwell Publishing, 2001). I summarise it below.

On this simplified picture, a possible world can be represented as a set of propositions; the actual world is given by the set of propositions that are actually true. The closeness of two possible worlds can then be treated as a function of how many propositions have the same truth value in both worlds, and of how dramatic the effects of the differences are.
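As a rough illustration of this simplified picture only (Lewis’s own similarity ordering is richer, and also weighs how dramatic the differences are), one could model worlds as assignments of truth values to propositions and count disagreements; the function and example propositions below are purely hypothetical:

```python
# Toy model: a possible world as an assignment of truth values to propositions.
# This is a crude illustration of "closeness", not Lewis's similarity ordering.

def distance(world_a: dict[str, bool], world_b: dict[str, bool]) -> int:
    """Count the propositions on which two worlds disagree."""
    propositions = world_a.keys() | world_b.keys()
    return sum(world_a.get(p) != world_b.get(p) for p in propositions)

actual = {"it is raining": True, "there is a bear nearby": False}
nearby = {"it is raining": False, "there is a bear nearby": False}
remote = {"it is raining": False, "there is a bear nearby": True}

print(distance(actual, nearby))  # 1 -- the worlds differ on one proposition
print(distance(actual, remote))  # 2 -- the worlds differ on two propositions
```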

Sensitivity And Adherence

Nozick believes that Sensitivity and Adherence are of equal importance. Zalabardo (J Zalabardo, UCL Department of Philosophy, forthcoming) disagrees, claiming that Sensitivity is more important than Adherence: the former does more work and the latter introduces more problems than it solves. For Zalabardo, we can improve Nozick’s account by dropping the Adherence condition altogether. If an evolutionary account can be produced to show that Sensitivity is more likely to be selected for than Adherence, that would bear on this argument.

Nozick introduces Adherence to handle a type of counterexample which might be termed `fortunate avoidance of misinformation’. The first example relies on the familiar skeptical problem of the brain in a vat being fed data indistinguishable from sensory input. If the experimenters doing this fed the brain the information that it was in fact a brain in a vat, we would not be tempted, according to Nozick, to allow it knowledge on the point. This is because the brain would not have formed such a belief in the close possible world where the experimenters do not feed in that information. The brain was in some way “too lucky” to really have that knowledge.

We can handle a further counterexample, due to Harman, similarly. A man reads a newspaper report that the dictator is dead. This is true, but the regime suppresses the information, issuing denials in subsequent editions; the man in question fails to see these denials and so continues to believe the truth. Intuitively, though, we are more inclined here to allow that this is a case of knowledge: after all, the man has formed a true belief by a method which was in fact reliable, whether he knew that or not.

Local Vs Global

It should be noted that Nozick’s reference to `local […] global’ in the first quotation given in the Introduction is irrelevant to our purposes here. He is using the terms temporally, to restrict (via the term `local’) the range of times relevant to our assessment of whether s knows p.

If p includes `now’ then it could change its truth value later. But we might still be prepared to accept that s knows p = `it is raining now’ even if s ceases to know that later on because the rain stops. Nozick acknowledges he might need further conditionality to account for this but believes it can sit atop his system rather than replace it.

So in the quotation, Nozick is in fact claiming that natural selection would prefer durable or robust knowledge to some evanescent version; this seems clear, and it does not bear on whether evolution would favor tracking at all. Importantly, he believes that evolution would favor tracking of both types even though it would preferentially select the global version.

Evolution Of Truth Tracking: False Belief

While prima facie it would seem to be evolutionarily beneficial always to have true beliefs (i.e. knowledge), there are in fact many situations in which false belief provides a higher level of fitness. McKay and Dennett discuss several examples (R McKay and D Dennett, The Evolution of Misbelief, Behavioral and Brain Sciences (2009) 32, 493–561). One of them is item recognition in an outdoor setting, which will be the major example considered here, in a later subsection.

Other examples discussed include the expectations of AIDS patients, false beliefs about the self, and the placebo effect. I will outline these briefly. The citation given in the case of AIDS patients relates to the initial period of prevalence of the disease in the US, prior to the development of treatments, when life expectancy in such patients was a matter of months. Contrary to received wisdom emphasizing acceptance, patients who ignored their HIV status and remained in denial experienced slower onset of symptoms and significantly lengthened survival.

Remarkably, it appears that optimal mental health is associated with delusionally positive false beliefs about the self, particularly in relation to one’s own capacities. Such beliefs are known to be a frequent occurrence, and so, on the type of argument made throughout this essay, they should confer some benefit.

Everyone thinks they are a better driver than average, and people make strong claims about the prowess of their children. All of these delusions may become partly self-fulfilling prophecies. Students who believe they will perform well in exams are less likely to succumb to performance anxiety and more likely to remain focused on work, which will of course improve their performance. Most ironically, people generally claim that they are less susceptible to self-delusion than others.

The Placebo Effect

The placebo effect is well known and need not be considered in detail here. The belief that crystals will heal one can make that belief partially true. The mere presence of doctors and a medical setting can begin healing before any treatment is commenced.

One explanation compares the body’s immune system to an army, with antibody cells as the soldiers. The body may throw its reserves into action when in a medical setting, because doing so is less risky when reinforcements in the form of external treatments are available. If this account is true, it would be one example of a situation in which a mechanism could evolve to translate false belief into beneficial action, though naturally in an evolutionary setting we would need to replace `hospital’ with `safe environment’.

Psychological Benefits Of False Beliefs

Some explanations for the prevalence of religious beliefs focus on the psychological benefits.

This type of idea is also present in many other philosophers. Nietzsche writes:

“throughout immense stretches of time the intellect produced nothing but errors; some of them proved to be useful and preservative of the species: he who fell in with them, or inherited them, waged the battle for himself and his offspring with better success.”

(F Nietzsche, The Gay Science, Cambridge University Press, 2001, Book III, §110)

By this, Nietzsche means, for example, that there would have been no astronomy without astrology and no chemistry without alchemy. Ryle (G Ryle, The Concept of Mind, University of Chicago Press, 1949, p. 13) also writes of new, more useful myths replacing older myths in physical science, giving the example of the concept of force replacing that of Final Causes, an improvement not deriving from veracity. Unger (P Unger, A Defense of Skepticism, The Philosophical Review, Vol. 80, No. 2 (Apr., 1971), pp. 198-219) considers a circumstance that `allows a false belief to be helpful’: it could arise when the alternative is to have no belief at all and some action is called for.

Error Asymmetry In Object Recognition

Absolute knowledge under all circumstances would indeed be optimally evolutionarily fit, but it is not obtainable. Given an irreducible amount of error, it becomes clear that in some situations of the kind that would have applied evolutionary pressure to the perceptual systems of early humans, this error is best accepted in one direction rather than the other. Nozick acknowledges this: `some sacrifice in the total ratio of true beliefs would occur to achieve a higher ratio of important true beliefs’ (Nozick, op. cit., p. 284).

Threat Detection

Imagine a situation in which an individual glimpses an item through the trees. It might be a bear or a rock. The terms `bear’ and `rock’ are placeholders for two categories of items: dangerous and harmless. The following table illustrates the possible outcomes in a matrix across what the item actually is and what the observer believes it to be.

Case | True Situation | Belief | T/F | Value
A)   | rock           | rock   | T   | good
B)   | rock           | bear   | F   | bad
C)   | bear           | rock   | F   | very bad
D)   | bear           | bear   | T   | very good

The column headed `Value’ indicates the quality of the outcome for the observer. Note at once the strong asymmetry in the cost of error, where error exists. In case A), the observer sees a rock and believes it to be a rock; this is a good outcome, and no action is necessary. In case B), the observer sees a rock but falsely believes it to be a bear; this is a bad outcome but not a disastrous one. Perhaps the observer will needlessly take avoiding action and dissipate some energy.

Evolution Of Truth Tracking: False Negatives Are Expensive

The worst possible outcome is C), where the observer sees a bear but falsely believes it to be a rock. This is dangerous, and it is the situation one would expect to be minimized in successfully reproducing individuals. The best outcome is D), where the observer sees a bear and identifies it as such. Then avoiding action can be taken.
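A back-of-the-envelope sketch can make the asymmetry vivid. The payoffs, prevalence, and error rates below are invented for illustration; nothing in the argument depends on the particular numbers, only on a missed bear being far more costly than a needless fright.

```python
# Illustrative sketch only: payoffs, prevalence and error rates are assumptions.
# Case labels follow the table above; a missed bear (C) is treated as far more
# costly than a false alarm about a rock (B).

PAYOFF = {"A": 1.0, "B": -1.0, "C": -1000.0, "D": 5.0}
P_BEAR = 0.01  # assumed prevalence of dangerous items among encounters

def expected_payoff(p_false_alarm: float, p_miss: float) -> float:
    """Expected payoff per encounter for a perceiver with the given error rates."""
    rock_cases = (1 - P_BEAR) * ((1 - p_false_alarm) * PAYOFF["A"] + p_false_alarm * PAYOFF["B"])
    bear_cases = P_BEAR * (p_miss * PAYOFF["C"] + (1 - p_miss) * PAYOFF["D"])
    return rock_cases + bear_cases

# A perceiver with equal 5% error rates in both directions...
print(expected_payoff(p_false_alarm=0.05, p_miss=0.05))   # ~0.44
# ...versus a "jumpy" perceiver that makes about four times as many errors
# overall, almost all of them harmless false alarms.
print(expected_payoff(p_false_alarm=0.20, p_miss=0.005))  # ~0.59
```

With these made-up numbers, the jumpy perceiver holds false beliefs about four times as often and yet does better on average, which is just Nozick’s point about sacrificing the total ratio of true beliefs for a higher ratio of important true beliefs.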

Note that we are merely classifying the perceptual outcome: so D) is a good perceptual scenario despite the fact that the individual might in fact be better off overall if there were no bears around at all.

Case A) is fine for the subject but relatively uninteresting. Note that natural selection will act only weakly on belief-formation mechanisms whose effects are mildly positive or negative, and will not act at all on those that are neither harmful nor beneficial.

Thus a subject persistently misidentifying items within a category will not, to a first approximation, be selected out. However, this may still occur if it produces a second-order effect resulting in a higher error rate in the dangerous category: for example, a subject performing sub-optimally on bear recognition by misidentifying trees as rocks, in a scenario where bears frequently live in trees.

Dominance Hierarchies

A similar table explains the emergence of dominance hierarchies in animals (S H Vessey, Dominance among Rhesus Monkeys, Political Psychology, Vol. 5, No. 4 (Dec., 1984), pp. 623-628). These do not develop solely through combat; some avoidance of combat is better for a group than losing members through constant deadly competition. This indicates that the bear/rock example above is not the only exemplar of this type of argument. The test proposition is p = `I will win this combat’; C is true if the animal opts for combat.

Case | True Situation | Belief | T/F | Option | Value
A)   | p              | p      | T   | C      | good: won
B)   | p              | ¬p     | F   | ¬C     | bad: missed win
C)   | ¬p             | p      | F   | C      | very bad: lost
D)   | ¬p             | ¬p     | T   | ¬C     | good: avoided loss

Animals avoid combats they believe they will lose, so they too err on the side of caution: they will accept some B) cases in order to avoid any C) cases, the worst and potentially fatal outcome, in which the animal falsely believes it will win, i.e. a Sensitivity failure. The argument is unaffected if animals do not in fact have beliefs but merely act as if they do, presumably via heuristics which, if optimal, will also track the truth. The asymmetry results from the fact that the animal needs to win some combats, though not all, but must avoid all serious defeats. Animals relying more on Sensitivity are therefore more likely to survive; a toy simulation of this trade-off is sketched below.
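The sketch below uses assumed noise and threshold values and a continuous winning probability rather than a simple true/false p, so it is only a loose analogue of the table; it shows how raising the confidence required before fighting converts potential C) cases into B) cases.

```python
# Illustrative sketch: noise level, thresholds and trial counts are assumptions,
# chosen only to exhibit the trade-off between missed wins (B) and lost fights (C).
import random

random.seed(0)

def opts_for_combat(true_win_prob: float, threshold: float, noise: float = 0.15) -> bool:
    """The animal fights only if its noisy self-estimate of winning clears the threshold."""
    estimate = true_win_prob + random.uniform(-noise, noise)
    return estimate >= threshold

def tally(threshold: float, trials: int = 10_000) -> tuple[int, int]:
    missed_wins = lost_fights = 0
    for _ in range(trials):
        true_win_prob = random.random()
        would_win = random.random() < true_win_prob
        if opts_for_combat(true_win_prob, threshold):
            if not would_win:
                lost_fights += 1   # case C: opted for combat and lost
        elif would_win:
            missed_wins += 1       # case B: avoided a combat it would have won
    return missed_wins, lost_fights

print(tally(threshold=0.5))  # bolder animal: fewer missed wins, many more lost fights
print(tally(threshold=0.9))  # cautious animal: more missed wins, very few lost fights
```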

Outcome Asymmetry

There is a minor asymmetry between the two tables. The two success cases are described as `good’ and `very good’ in the first table, while both are simply `good’ in the second. This disparity hides nothing major about the account, which is committed to the existence of the asymmetry rather than to its precise strength. It simply reflects the fact that avoidance of combat itself carries significance for the establishment of dominance hierarchies in animals.

Winning is certainly the best outcome, but avoiding a loss may not be too far distant a second best. However, in the case of the first table, it seems very clearly less valuable to identify a rock correctly than to identify a bear correctly because there is little purposeful interaction with rocks.

Evolution Of Truth Tracking: Error Asymmetry In Self Beliefs

Delusionally positive beliefs about the self can be beneficial if they relate to mental capacities; false beliefs about some physical capacities would be harmful. The table below shows the payoffs. The test proposition is p = `I am smarter than the rest’; in p, we can replace `smart’ with any beneficial mental quality.

Case | True Situation | Belief | T/F | Value
A)   | p              | p      | T   | good: accurate view of smartness
B)   | p              | ¬p     | F   | bad: underestimation of smartness
C)   | ¬p             | p      | F   | very good: `bluffer's bonus'
D)   | ¬p             | ¬p     | T   | bad: missed bluffer's bonus

The term `bluffer’s bonus’ indicates situations of the type in which weak students have an unrealistically high belief in their own prowess and thus in fact perform better in an exam, because they are not discouraged from working by an accurate perception of their likelihood of failure. Since exams did not take place in evolutionarily relevant times, we would need to consider positive mental qualities such as problem-solving ability: persons with an unrealistically high assessment of their own skills might obtain the bluffer’s bonus by persisting longer with a difficult task.
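A toy calculation, with invented figures, shows how persistence alone can convert overconfidence into better outcomes. Suppose each attempt at some difficult task succeeds independently with probability q = 0.2, and self-assessment affects only how long one persists:

```latex
% Invented figures purely for illustration: per-attempt success probability q = 0.2
\begin{align*}
\Pr(\text{success within } n \text{ attempts}) &= 1 - (1 - q)^{n}\\
\text{realistic assessor, gives up after 3 attempts:}\quad & 1 - 0.8^{3} \approx 0.49\\
\text{overconfident assessor, persists for 8 attempts:}\quad & 1 - 0.8^{8} \approx 0.83
\end{align*}
```

On these assumptions, the overconfident assessor succeeds substantially more often despite holding the less accurate self-belief.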

Deception Can Be Useful

Alternatively, we could use some types of physical capacity, for example p = `I can cross this difficult terrain without food or significant discomfort’, which carries the benefit while avoiding any risks in the area of unrealistic views of combat prowess. The bluffer’s bonus may additionally be adaptive because it aids deception of others in relation to one’s own capacities.

Evolution Of Truth Tracking: Challenges To Nozick

Bear/rock

We can paraphrase Sensitivity and Adherence as below.

  • Sensitivity: avoid believing p when p is false, i.e. minimize false positives
  • Adherence: believe p whenever p is true, i.e. maximize true positives

If we take the scenarios in which a bear is present, then Adherence is more important than Sensitivity: we want to detect all the bears and can handle some false alarms. The situation is reversed for the scenarios in which a rock is present, however; there, Sensitivity is more important than Adherence. A failure of Sensitivity means that, were a rock not present (so that a bear is present instead), the observer would fail to recognize the danger presented by the bear. A failure of Adherence means only that the observer is needlessly scared by a rock. This may be slightly deleterious, in that it could distract from more useful pursuits or lead to a propensity to react less when faced with an actual bear, but it is certainly much less dangerous than failing to recognize the actual bear.

Evolution Of Truth Tracking: Sensitivity Is Indeed More Important Than Adherence

Whether evolution favours truth tracking does not immediately seem to depend on which of these scenarios obtains.

So far, the bear/rock scenarios are neutral for Nozick: Sensitivity and Adherence both play important roles across the two possibilities. However, the question then becomes one of the prevalence of bears and rocks in the evolutionary environment, or the relative proportion of dangerous to harmless items. Since we are here to ask the question, and since this is how matters stand now, there is a strong presumption that there were in fact far fewer bears than rocks. Therefore, in the situation as it obtains, and presumably has obtained throughout the evolutionary period, Sensitivity is more important than Adherence.
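To make the prevalence point concrete, with figures invented purely for illustration: if only one encounter in a hundred involves a dangerous item, the condition guarding against the fatal error is overwhelmingly the Sensitivity of the prevailing `rock’ beliefs rather than the Adherence of the rare `bear’ beliefs.

```latex
% Invented prevalence, purely for illustration
\begin{align*}
\Pr(\text{bear}) = 0.01 \implies \text{per } 1000 \text{ encounters:}\quad
& 0.99 \times 1000 = 990 \text{ rock scenarios (Sensitivity guards against the fatal miss)}\\
& 0.01 \times 1000 = 10 \text{ bear scenarios (Adherence keeps the bear detected)}
\end{align*}
```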

Sensitivity Does Not Do More Than Adherence

Nozick will use the response from the Introduction here. Sensitivity may indeed be more important than Adherence as a practical matter, but this is not sufficient to establish the link from practical importance to relevance for epistemic assessment. We might have an argument showing that one or other of Sensitivity and Adherence is more beneficial evolutionarily, and this could perhaps show that we are more likely, as evolved creatures, to rely on one rather than the other.

But a different sense of `importance’ is at stake in deciding whether s knows p. Even if one of the conditions were operative far more frequently in the actual world, this could only show that knowledge is more frequently secured in the actual world via that condition. What matters for Nozick is the shape of knowledge as it extends out into near and far possible worlds, or more precisely, what shape our beliefs would have to take in reasonably close possible worlds in order to qualify as knowledge in the actual world.

Evolution Of Truth Tracking: Omniscience And Truth

Omniscient beings would have beliefs that tracked the truth in all possible worlds. Fortunate mortals who happen merely to have true belief do not qualify as having knowledge, because their beliefs would fail to track the truth in very close possible worlds. We have not evolved omniscience, but on Nozick’s picture we have evolved to have beliefs that are somewhat robust out to some distance in the space of possible worlds.

Positive Items

Another possible response for Nozick would be to note that positive items have not yet been considered, and to claim that including them would alter the picture. We may term these `apples’: the category can stand for any useful or nourishing object. It seems here that neither Sensitivity nor Adherence will play a major role.

Failing to spot an actual apple will not harm anyone greatly, provided plenty of apples are still detected. Misclassifying something else as an apple will similarly not cause gross harm, as the error will be detected fairly soon. In any case, the same argument as before is available: positive and negative items alike will be much less prevalent than neutral ones, and so the inclusion of positive items likely does not alter Nozick’s position in either direction.

Bluffer’s Bonus

The most significant case is C), in which someone benefits from the bluffer’s bonus; this can result in substantial out-performance. Here Sensitivity has failed: p is believed even though it is false (Adherence, which concerns worlds where p is true, is not engaged). The other case of false belief, B), involves a failure of Adherence: the subject fails to believe p although it is true. This outcome is unhelpful to participants who are unrealistically pessimistic about their performance prospects, but since they are in fact highly capable, they will do well provided the pessimism remains within bounds.

Nozick can admit that in this set of cases, Adherence has played the more important role. He can then note, by analogy with the earlier bear/rock cases, that inverting p to `I am dumber than the rest’ will invert all of the value entries, and Sensitivity will become more important. We would then need a way of assessing whether euphoria or depression has been more common in order to decide the relative practical importance, and this seems a difficult task.

Ethics

The Evolution Of Truth Tracking looks more likely if it can explain the development of ethical behaviour.

In addition, Nozick may be able to rely on potential ethical implications of his account. This becomes relevant from an evolutionary perspective because there is some evidence that altruistic behavior is adaptive. This appears counterintuitive, because altruism means giving up resources for no apparent gain; and yet altruism could evolve in groups, even groups whose members are unrelated, because of the benefits of mutual assistance.

Having developed his tracking account of knowledge, Nozick extends the tracking analysis into ethics (Nozick, op. cit., p. 291) by noting that ethical behavior should track rightness. In this new context, Sensitivity becomes approximately `we should not do those things which are not right’ and Adherence `we should do those things that are right’.

No clear answer can be given as to the correct picture of ethics, but it is clear that any version which only uses one of these principles would be very different to the duplex account. Sensitive but non-Adherent ethics would prohibit murder but permit negligent slaughter while Adherent but non-Sensitive ethics would mandate saving drowning babies but remain silent on murder. To the extent that those characterizations are correct and to the further extent that we have any evolved ethical behavior, Nozick has a further line tending to show that evolution favors retention of both conditions.

Evolution Of Truth Tracking: Robustness

If there is Evolution Of Truth Tracking then Sensitivity and Adherence should also work in close possible worlds.

A different line of argument can also appear to produce differential levels of importance for Sensitivity and Adherence, though this time in the opposite direction, giving priority to Adherence. It relies on Nozick’s observation that close possible but non-actual worlds can influence the evolution of our belief-formation mechanisms. Without moving too far in the direction of Lewis’s modal realism, this can be understood as allowing for some flexibility to adapt to changing environments. Clearly, creatures with excellent belief-formation mechanisms in the actual world will not be as fit as those which would, in addition, retain such mechanisms in close possible worlds. Such creatures would be robust in changing environments.

If Nozick’s account of knowledge is correct, then this should mean that we have beliefs which are Sensitive and Adherent over some range of close possible worlds. There is some expenditure of evolutionary capital, as it were, to provide these additional capacities, and they do not produce compensatory payoffs in the actual world until it changes. This means that there will be a limit to the remoteness of possible worlds which can influence the development of belief-forming mechanisms. Intuitively, this just means that creatures with the ability to form correct beliefs about extremely implausible developments from their current environment will not be favored.

Evolution Of Truth Tracking: Distance To Other Possible Worlds

The asymmetry between Sensitivity and Adherence can now be seen. Adherence is tested in closer worlds than Sensitivity, because p is true in the actual world. Intuitively, the two conditions combined require that the belief switch off in worlds where p is false (Sensitivity) but not before then (Adherence). Nozick’s response here will be to rely on the flexibility inherent in the possible worlds approach: he can insist that the objector provide an account of why the range of evolutionarily relevant worlds is smaller than the distance to the nearest not-p worlds.

Evolution Of Truth Tracking: Conclusion

It seems that Nozick is able to head off any evolutionarily-based challenges to his account of knowledge: we could have evolved to be Sensitive and Adherent.

See Also:

Are We Allowed To Follow Our Personal Aims? Nagel says Maybe

Links Between Schopenhauer And Apocalypse Now

Quine And Fine on Reference and Modality

Husserl’s Phenomenological Reduction: What Is It And Why Does Husserl Believe It To Be Necessary?
