People and, in turn, our world are dominated by intuition. We listen to music and watch films that bring out certain abstract feelings in us. We spend time with people we click with for somewhat inexplicable reasons. We dedicate our careers to what feels right to us and what we care about.
Intuition makes up the most human element of all of us, allowing us to experience things and express ourselves in distinctive ways. When you spend time in ways that just feel right or are intuitive to you, you can’t help but be fulfilled since there’s no deliberation necessary in these decisions. You just know they’re right.
But, of course, intuition can be wildly inaccurate. Someone at a prestigious firm may feel in their gut, based on the suaveness and pedigrees of its partners, that they must be doing important things for society, when in reality they may not be contributing anything that wouldn’t have happened without them. After all, in the field of investment management, a blindfolded monkey is better at picking stocks than investment professionals are.
In the book Thinking, Fast and Slow, psychologist Daniel Kahneman categorizes human thought as System 1 and System 2. System 1 is what I’ve been referring to as intuition and what you might use to assume that a new colleague is lazy and disorganized when they walk in late to a meeting looking disheveled even if you don’t really know why they were late. This system of thinking is fast, emotional, and requires little effort. System 2 is slower, more logical, and requires more effort. You’d probably need to use this system to solve a tough calculus problem.
The book summarizes decades of research suggesting that we place too much confidence in our System 1 and often make poor decisions because of it. One case that has stuck with me since I learned about it is the sunk cost fallacy: our instinct to follow through on something once we have already invested time, effort, or money in it. For example, someone might think they need to become a physics professor just because they spent the last five years earning their physics PhD, when they’d honestly be a lot happier as a comic book writer. This is a fallacy because you can’t recoup what you’ve already invested, so it shouldn’t be the main factor in your decision-making. You should simply do what is best for you now.
Struck by how much I really can’t trust my own judgement, and perhaps also lacking strong intuition one way or another about how to live my life, I deferred heavily to the work of academics and researchers on topics I care about, since I assumed they had spent far more time using their System 2 to study these topics than I have. But after encountering some researchers who seemed too certain about their conclusions, my intuition (hah) told me to be doubtful. Could these researchers have cognitive biases of their own that affect how they experiment and interpret their results? What about the fact that the very act of observing phenomena changes how they behave?
In Episode 142 of the Clearer Thinking podcast, journalist Christie Aschwanden elegantly explains the limitations of research, including the following:
- In certain fields like athletic performance, it is very common to have studies with extremely small sample sizes that produce unreliable results.
- Testing the same hypothesis with different methods can produce different conclusions.
- Studies may test something far more specific than their titles or conclusions suggest. For example, a study concluding that those who study abroad are more creative may actually just have found that those who studied abroad from a New England liberal arts college paint in their free time.
- Studies are susceptible to the McNamara fallacy, which is the belief that anything that cannot be measured is not important.
- Although science progresses through individual studies, arriving at anything like established truth is a much slower process.
Although we should take all studies with a grain of salt, many have obviously had enormous positive impacts on the world, so it would be remiss to ignore them entirely. But, as Aschwanden explains, getting to the truth, if it’s possible at all, can be very slow. So, unless the research in an area we care about has been validated many times over, we shouldn’t stress about failing to follow the “best science.” We should instead be skeptical of the first conclusions we see on a topic, stay open-minded about the possibility of other conclusions, and perhaps even do some experimenting ourselves.
Also, as Aschwanden wisely states, we invest a lot of time, effort, and resources into life-optimizing strategies with very marginal gains, when all we really need to do is exercise, eat a balanced diet, get enough sleep, manage our stress, and get some social interaction.
But, of course, beyond basic needs, everyone needs different things to be fulfilled that they may want to optimize. Something to consider in any attempt to optimize is what would be sustainable over the long run. When I was on my middle school cross country team, I tried to “optimize” my performance by running as fast as I could in every meet. What resulted was an utter hatred for running, and I didn’t run long distances again for another 11 years. What finally broke the ban was a marathon-running friend telling me that you should only run at a speed at which you can comfortably talk to others, which was mind-blowing to me. Now I can actually look forward to and enjoy running long distances, because I don’t hate my life while I’m doing it. These are things that your intuition, or perhaps just a basic understanding of your physiological capacity, can be really helpful for.
As a final food for thought, Napoleon defined a “military genius” as “a man who can do the average thing when all those around him are going crazy.”