A final concept that is essential for becoming a better critical thinker is adopting a stance of epistemic humility. As we have already seen, our thinking can be clouded by cognitive biases. Additionally, our perspective on the world is always colored by our own experience and rooted in the particular place and time in which we live. Finally, even our best scientific knowledge of the universe explains only a fraction of it, and perhaps even less of our own experience. As a result, we should recognize these limitations of human knowledge and rein in our epistemic confidence. We should recognize that the knowledge we do possess is fragile, historical, and conditioned by a number of social and biological processes.
Question Yourself: Do I Really Know What I Think I Know?
We retain all sorts of beliefs from many different sources: memory, testimony, sense perception, and imagination. Some of these sources may be reliable, while others may not. Often, however, we forget the source of our beliefs and claim to “know” something simply because we have believed it for a long time. We may become very confident in believing something that never happened or did not happen in the way we remember it. In other cases, we may have been told something repeatedly, but the ultimate source of that information was unreliable. For instance, many people recommend wearing warm clothes outside when the temperature drops so that you do not “catch a cold.” This is the sort of wisdom that may have been passed down through generations, but it makes little sense from a medical standpoint. Getting a chill, or even a drop in body temperature, does not by itself produce a respiratory infection. Colds are caused by viruses, not by a drop in temperature. Without thinking through the source of the belief that “if you get cold, you may catch a cold,” you end up believing something that is not true.
Be Aware of the Dunning-Kruger Effect
An even more pernicious form of epistemic overconfidence is revealed in the psychological phenomenon known as the Dunning-Kruger effect. David Dunning and Justin Kruger demonstrated a widespread illusion in which incompetent people or novices rate their own knowledge of a subject more highly than they ought to, while highly competent people or experts rate their knowledge slightly lower than they ought to. These findings do not mean that the experts considered themselves to be less competent than novices. In fact, experts are fairly accurate in rating their own knowledge. However, they tend to assume that everyone else has a similar level of expertise. By contrast, novices consider themselves to be far more competent than others and fail to recognize their own incompetence, which can be dangerous in many situations.
The lesson from the Dunning-Kruger effect is that you should be extremely wary when assessing your expertise about anything, but especially about something that is a new area of learning for you. The reality is that your intuitive sense of your own knowledge is likely to be inaccurate. It takes time to build expertise in a subject area, and experts are better able to assess their own knowledge accurately.
The content of this course has been taken from the free Philosophy textbook by OpenStax.