This Story Is Eerily Familiar
Updated: May 31, 2021
The impact books have on our lives is not limited to the words written between the covers. Some books inspire new thoughts and send us to unexpected places. Follow me Down the Rabbit Hole in this recurring segment.
As you may recall from last week’s blog post, in her book Untamed, Glennon Doyle describes four keys to escaping her cage and becoming “untamed.” The first was Feel It All. This week, we’re going to start our tangent with the second key: Be Still and Know. Doyle talks about her “Knowing”, about sinking down into herself and finding what is true and beautiful, of waiting for a nudge in the right direction as opposed to thinking through a problem. This knowing has many other names, such as inner wisdom, inner voice, or intuition.
What is intuition? What evolutionary advantage does it present? How does it compare to the truth? These questions brought me quite swiftly to the field of behavioural science. In fact, this field of study is so fascinating, it may even crop up next week.
Oxford Languages defines behavioural science as “the scientific study of human and animal behavior.” This is a vast subject. For our purposes we’re just going to touch on a few things that affect human behaviour and thinking.
Our brains take fascinating shortcuts to both conserve mental energy and make effortless decisions. The technical term for these shortcuts is heuristics. These shortcuts served us well throughout most of our evolution; today, however, in a world of globalization and the internet, they leave us susceptible to manipulation. Heuristics can result in biases, such as the well-known confirmation bias, where one is more likely to notice and trust information that supports their current belief. Another bias that has received a lot of media attention recently is the illusory truth effect, wherein repetition of the same false information leads people to believe it to be true, even when they initially knew it to be false. Pair this with social media, and the lines between truth and fallacy start to blur.
If you’re like me, you don’t want to believe you can be manipulated or that you hold biased opinions, even though the word “opinion” itself implies bias. I like to believe that I’m a rational being who carefully weighs the facts before formulating an opinion, despite the fact that it’s impractical to put that much brain power into every decision. My tendency to be overconfident about the accuracy of my judgments has a name: the illusion of validity. Yes, that’s right, another bias.
Western cultures tend to hold up logic and reasoning as superior in comparison to emotion and intuition. Bias is associated with many negative images, including flawed science, prejudice, and nepotism. Consequently, bias can conjure up some unpleasant feelings, and any mention of it can generate an automatic response: bias is bad. Ironically, this may be the affect heuristic in action, whereby a decision is made by relying on emotions. This whole business seems hopeless. Perhaps what is needed is a paradigm shift.
Bias is not a four-letter word. I mean, it does technically have four letters, but you catch my drift. Let’s try and take an unbiased look at bias.
In his book Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman describes two distinct cognitive systems for decision-making. System 1 is fast, automatic, intuitive, and more prone to error. System 2 is slow, deliberate, dominated by reasoning, and requires more mental energy. Called dual process theory, it is essentially a marriage between perception and reasoning. Most of our daily decisions and actions use fast-thinking System 1, which makes sense from an evolutionary point of view, since it requires less energy.
System 1 is skilled—most of our actions require skill, like driving a car or carrying on a conversation. It’s probably one of the reasons experience is valued—the more time someone spends in deliberate, rational practice of a skill using System 2, presumably the more skilled their automatic System 1 becomes. Think about a chess master who seems to intuitively know the next best move. A difficult question is answered automatically with heuristics drawn from past events. Experience and common sense are examples of heuristics. In this case, biases serve us well—and most decisions are made using them.
The perceptions used in fast thinking are automatic, and therefore undetectable. This is how we exert our biases without knowing it. What kinds of things affect our fast thinking? Familiarity. Emotions. Repetition. Recent events. Observed association. Similarity. Prominence. Preexisting beliefs.
The automatic action of the fast-thinking decision process reminds me of the automatic process of neuroception, introduced last week in Shh... Your Feelings Are Trying To Tell You Something when we were talking about the Polyvagal Theory. Recall that during neuroception, threat or safety is evaluated automatically, causing the nervous system to react accordingly. If threatened, the autonomic nervous system reacts with fight, flight, or immobilization, and focusses cognitive resources on the stressor. If safe, the nervous system supports social engagement, restoration, and learning. The similarity between polyvagal theory and System 1—namely, automatic perception in the absence of cognitive awareness—had me wondering if they are related.
As a lover of irony, I confess that the connections I’m seeing may merely be examples of illusory correlation. That said, this is what Down the Rabbit Hole is all about: following a stream of consciousness to see where it takes us.
Now, it just so happens that System 1 is a great storyteller—it can fabricate a story from the available information, no matter how scant. The story’s coherence is so convincing, it feels real and true. In this case, biases help us make sense of the world.
What about slow-thinking System 2? It is deliberate, rational, and uses a lot of energy. Essentially, if you’re focusing and concentrating, you’re using it. It generates doubt—another four-letter word—and thank goodness it does, because doubt motivates us to learn and grow. It’s energy-intensive work! Mental energy can be depleted by, among other things, concentration, decision-making, or emotional regulation, and spending heavily on one leaves little to fuel the others. In other words, you might get moody or find yourself unable to concentrate when you’re mentally exhausted. I know—not a remarkable revelation. Similarly, the polyvagal theory proposes that a sense of safety is required for focused learning. This makes sense—stress impairs our ability to focus and learn. In fact, chronic stress changes your brain, negatively affects working memory, and makes it hard to learn and remember. A couple of effective ways to combat chronic stress are exercise and meditation. They, too, can change your brain, but for the better. From a decision-making perspective, if the energy-intensive System 2 were being overworked, they could also give it a chance to recuperate.
I think I have enough pieces to put together a cohesive story about Doyle’s advice Be Still And Know. Here we go.
Once upon a time there was a woman who spent her life trying to live up to everyone else’s expectations. She strived for an unattainable notion of perfection, and told herself that if she just kept hustling, she could do it all. And she could—kind of. Through focused attention she built skills, and she took on more and more responsibility. With so many balls in the air she lived in constant fear of dropping one. The chronic stress of living in a state of fight-flight-immobilize hijacked her ability to concentrate on anything other than the threats immediately in front of her. Constant multitasking made things harder, not easier, as her concentration was repeatedly interrupted. Operating on autopilot, when she doubted herself, she didn’t have the mental resources to analyze the source of it. All her reason-based resources were dedicated to building skills to optimize multitasking. Instead, doubt added to her stress. Neither did she have energy for managing her emotions, so instead she repressed and numbed them, neglecting her well-being. She was trapped in a perpetual state of survival, without sufficient resources to break out. She felt disconnected from her own body.
This was no way to live. One day she asked herself, what do I want? Yet, she was unable to hear her own voice through the noise. Seeking quiet, she started meditating. It felt safe. Her autonomic nervous system calmed, freeing her from fight-flight-immobilize, freeing her mind from focusing on her stressors, opening it to perceive everything else. Her brain rested and replenished its energy stores. It felt good.
She started meditating daily. Her brain started to change, recuperating from years of stress. One day in that safe space a thought arose, nudging her to think about something new. It was effortless and automatic. It felt like a revelation! The intuition of fast-thinking System 1 was finally free from the physiological grip of fight-flight-immobilize. The thought felt true and real. It reminded her of all the stories she’d heard about people hearing their inner voice, and she became convinced this must be hers. She automatically trusted it because it felt true, it felt real, and it was hers.
This is all just a story, of course, even if some of you find it eerily familiar. The explanation woven through the story feels real and true to me. Sure, the advice to meditate to calm your nervous system might just be another way of delivering Doyle’s advice to “sink beneath the swirling surf of words, fear, expectations, conditioning, and advice—and feel for the Knowing.” Except hers is more poetic, which is probably why she’s a New York Times bestselling author while I’m a lowly blogger.
Still, unsurprisingly, I like my explanation better, mainly because I don’t like the idea of accepting opinions as facts, or of calling impressions truths. Humans have biases, and it’s good that we do, provided we acknowledge their limitations and potential for error. For me, the language surrounding intuition is important. It seems all too easy for someone to jump from my truth to the truth, or from knowing to knowledge. For this reason, I much prefer the term belief over truth.
It’s hard enough these days to tease the truth out of all the information and misinformation out there without calling each individual’s opinion by the same name. Personally, I like to use fact-checking websites like Snopes when I can. Barring that, there’s always the library, or even good old-fashioned thinking-it-out, making sure, of course, that I have the energy resources to do so. Perhaps I’ll now consider a little meditation before embarking on serious deep thinking. That feels reasonable.
How do your intuition and beliefs inform your decisions? Knowing that biases affect your intuition, what kind of focused learning might you pursue to challenge or confirm the beliefs felt by System 1? Do you have a different explanation of the advice Be Still And Know? We’d love to hear your story. Comment below, email us, or connect through the Book Interrupted Book Club Facebook group!