The Illusory Truth Effect is one of the ways we inaccurately use our feelings to make decisions. We use at least two methods to decide on the truth of a claim or the correctness of new information. The first method is somewhat allied to one of the philosophical accounts of knowledge: coherentism. We assess the claim based on whether it is consistent with what we think we already know. The second method is to consider how we feel about the claim or purported new information.
Both approaches have drawbacks. The first method, while probably the best available, can lead people into multiple errors. If you already believe something false, you are more likely to believe further false claims which are allied to the first false claim. We see many pernicious illustrations of this; for example, in political polarisation and various forms of prejudice.
The second approach is more damaging. In fact, deciding whether something is true based on how we feel about it looks so odd that you might wonder whether it can possibly be the case that this happens. This is another example of a puzzling psychological bias which in fact makes sense for us to exhibit because, on average, it will produce an answer which is “good enough.”
One thing we don’t like is work. If we have seen a claim many times before, we don’t need to work hard to decide whether it is true when we encounter it again. (This can also be seen as a processing fluency effect.) We are comfortable with the claim or the apparent information. I don’t need to think about the route to walk to the gym because I have taken it many times before and it has always worked. This familiarity effect, or ease of processing effect, is fine in relation to the route to the gym. And there are going to be a lot of daily questions like that where it would be inefficient to reevaluate them.
This is all fine. However, it turns out that we also do this with false claims which we have seen often. That, of course, is going to be a huge problem. The Illusory Truth Effect is also known as the Reiteration Effect for this reason. Basically, if I tell you something false enough times, you are likely to become comfortable with it and more likely to believe it.
This will have frequent damaging effects in financial markets. For example, in the case of the Bitcoin bubble, which I forecast approximately three days before the peak, there are, I think, some causal factors deriving from the Illusory Truth Effect, though as I discuss there, many other psychological biases and errors were at work in the Bitcoin bubble.
In particular, what we saw in the case of the Bitcoin bubble was the cult-like nature of the phenomenon. Proponents of the cryptocurrency repeated the same false claims hundreds of times: “it can only go up;” “Bill Gates is enthusiastic about it;” or “all we have to do is HODL (sic) and everything will be fine.” Cult members believed all of this partly because they had heard it all many times and so had become familiar with it.
Turning to the professional sphere, we can expect that the Illusory Truth Effect will play a part in any bubble involving more than just the inexperienced investors who became infected in the Bitcoin epidemic. The DotCom bubble caught a lot of people (including myself, because I was young and inexperienced). We heard many times that anything involving the internet was going to be a huge success. So we started to believe it.
There are many features of markets that are true until they aren’t. Try to avoid believing something merely because you have heard it a lot. Look for evidence.