Stories Are Abstractions to Generate Meaning

“There’s bullshitters and there’s liars. Difference is, the liar tries to hide his bullshit while the bullshitter lets you know he’s lying. That’s why I like bullshitters more than liars.” – Matthew McConaughey

In both personal and professional spheres, narratives – our abstractions about data and experience – are crucial for drawing insights, making meaning and fostering connection. They’re not automatically propaganda or manipulative tools; rather, they help us integrate raw, entropic information into coherent models of reality. Maintaining a sense of personal narrative strengthens our identity and relationships, as it gives both others and ourselves a clearer understanding of who we are. Without it, we often get lost in excessive data or suffer communication breakdowns.

While sharing or aligning with certain narratives can slip into tribalism in today’s political climate, tribalism isn’t an inevitable consequence of generating stories. The key lies in how we hold these abstractions. In other words, it’s a skill issue. Healthy, adaptable storytelling can encourage openness instead of division. Holding strong narratives “lightly” allows them to evolve with changing circumstances and can unify groups. Boundaries ensure we don’t feel obligated to overshare or conform to an imposed image. Ultimately, narratives reduce confusion and promote connection when we remain flexible and mindful that these stories form part of what makes us human.

But there’s the rub: What does it mean to hold narratives lightly in a real life fraught with complex narratives and noisy streams of data? Drawing on Karl Popper’s principle of falsifiability, scientific fields frame abstractions as testable hypotheses. In deductive reasoning, the relationship is clear: “I believe X because of Y” means that if Y is false, X must be rejected. But most real-world narratives operate inductively – “Evidence Y increases my confidence in X.” When Y is absent or false, we don’t completely reject X, but rather update our degree of belief. This is why unverifiable hypotheses cannot become part of established scientific truth – without clear criteria for success or failure, we can’t even begin this process of belief updating. Without observable evidence to test against, we can neither deduce truth with certainty nor inductively refine our confidence. This same principle applies to our personal narratives – they must remain testable against reality, even if only probabilistically, to avoid becoming mere dogma.
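
As a minimal sketch of what “updating a degree of belief” rather than rejecting outright can look like, the snippet below applies Bayes’ rule to a single narrative and a single piece of evidence. The narrative, the evidence, and every probability are invented for illustration; the point is the mechanic, not the numbers.

```python
# A toy illustration of inductive belief updating via Bayes' rule.
# The narrative, the evidence, and all numbers below are hypothetical.

def update_belief(prior, p_y_given_x, p_y_given_not_x):
    """Return P(X | Y) from P(X), P(Y | X), and P(Y | not-X)."""
    p_y = p_y_given_x * prior + p_y_given_not_x * (1 - prior)
    return p_y_given_x * prior / p_y

# Narrative X: "this team ships reliably"; evidence Y: "the last milestone shipped on time".
prior = 0.5
after_y = update_belief(prior, p_y_given_x=0.8, p_y_given_not_x=0.3)
print(f"Confidence after observing Y:     {after_y:.2f}")   # rises to ~0.73, but X is not proven

# If Y fails to materialize, we update on not-Y instead of discarding X outright.
after_not_y = update_belief(prior, p_y_given_x=0.2, p_y_given_not_x=0.7)
print(f"Confidence after observing not-Y: {after_not_y:.2f}")  # falls to ~0.22, but X is not refuted

# An unfalsifiable narrative is one for which no observable Y moves this number at all.
```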

This tension between testability and belief becomes most acute when we consider our core convictions. Our personal narratives often begin as provisional models of reality – subject to updating based on evidence. But unlike scientific hypotheses, these narratives can gradually shift from the realm of testable claims to become load-bearing elements of our identity. What starts as ‘I believe X because I’ve observed Y’ transforms into ‘X is part of who I am.’ Like a frog in slowly heating water, we rarely notice this transformation until we find ourselves fully immersed in beliefs that have moved beyond the reach of evidence.

Reflect on how many of your deeply held beliefs are not open to direct falsification, yet still guide your actions and relationships. It’s an uncomfortable process to engage in actively. But beliefs that are independent of evidence must have stayed in the realm of unfalsifiability long enough to become a part of you – an identity. Killing the idea no longer leaves only the idea in the graveyard; it takes (a part of) you with it.

Yet recognizing this intertwining of identity and belief doesn’t have to end in behavioral paralysis or cynicism (e.g. “nothing matters because all beliefs are relative”). Instead, it can be an invitation to examine why certain stories or convictions become central to our sense of self. Holding narratives lightly means staying curious about them—asking, “Where did this come from?” and “Does this abstraction accurately model reality?” while avoiding the anxiety that we must either discard our cherished beliefs entirely or cling to them dogmatically. Curiosity offers a partial cure to paralysis and cynicism.

If curiosity is the axiom of the hybrid-rational agent, then discarding unfalsifiable beliefs becomes a more systematic process.

A Case Study on Truth-seeking in a Dynamic Group Environment

We can understand this through three archetypal agents:

Agent C: The truth-seeker

  • Updates beliefs based on evidence
  • Only supports claims with actual evidence
  • Withdraws support when counter-evidence emerges
  • Follows both deductive and inductive logic properly

Agent B: The bullshitter (if open)/liar (if hiding)

  • Has seen counter-evidence
  • Still supports the original claim
  • The key difference is whether they acknowledge their epistemic compromise

Agent A: The pure liar

  • Knows the foundational evidence doesn’t exist
  • Claims it does anyway
  • Direct contradiction of known truth

This maps cleanly to real-world dynamics: the scientific method aspires to Agent C behavior, political discourse often operates at Agent B level, and propaganda works at Agent A level. The tragic part is that institutions and systems often pressure Agent C types to behave like Agent B or A, leading to organizational rot. This pressure to abandon proper epistemic discipline is what makes maintaining Agent C behavior so difficult yet vital.
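
To make the archetypes slightly more concrete, here is a deliberately crude sketch in which all three agents face the same evidence stream about a single claim. The evidence values, thresholds, and behaviors are invented for illustration, not drawn from any formal model.

```python
# A toy rendering of the three archetypes. Evidence is a list of +1 (supporting)
# and -1 (contradicting) observations about one claim; the values are made up.

def agent_c(evidence):
    """Truth-seeker: public support tracks the balance of evidence."""
    return "supports the claim" if sum(evidence) > 0 else "withdraws support"

def agent_b(evidence, acknowledges_compromise=True):
    """Has seen the counter-evidence but keeps supporting the claim; the only
    variable is whether the epistemic compromise is acknowledged."""
    return ("supports, openly hedging" if acknowledges_compromise
            else "supports, hiding the compromise")

def agent_a(evidence):
    """Pure liar: asserts the foundational evidence exists regardless of reality."""
    return "asserts the evidence exists"

evidence = [+1, -1, -1, -1]  # an initially promising claim later overturned by counter-evidence
print("Agent C:", agent_c(evidence))
print("Agent B:", agent_b(evidence))
print("Agent A:", agent_a(evidence))
```

Note that only Agent C’s output depends on the evidence at all; the other two functions ignore their input, which is exactly the failure the archetypes describe.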

A skeptic might contend, “You may be pressured into situations where accepting an unfalsifiable narrative correlates with survival.” That is also true, but there’s a crucial distinction in how we handle such situations. As Matthew McConaughey observed, “There’s bullshitters and there’s liars. Difference is, the liar tries to hide his bullshit while the bullshitter lets you know he’s lying.” When circumstances force us to engage with unfalsifiable narratives, signaling our provisional relationship to them – akin to “bullshitting” – preserves more intellectual honesty than pretending full belief. Concealing this tragic dynamic rots organizations and nations and makes any remedy less likely. Perhaps many people fall into this category.

Agent D: The deceived truth-seeker

  • Follows truth-seeking principles of Agent C
  • But operates on carefully curated/biased information
  • Genuinely believes the claim and finds evidence for it
  • Unknowingly perpetuates false narratives while thinking they’re practicing good epistemic discipline
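
Agent D can be sketched as Agent C’s honest update rule applied to a filtered evidence stream: the rule is sound, but the inputs are curated. The filtering probabilities and the underlying evidence distribution below are, again, purely hypothetical.

```python
import random

random.seed(0)  # reproducibility of this toy run

# The world mostly contradicts the claim: ~30% supporting (+1), ~70% contradicting (-1).
world = [+1 if random.random() < 0.3 else -1 for _ in range(1000)]

# A curator forwards supporting observations far more often than contradicting ones.
def curated(stream, keep_support=0.9, keep_contra=0.1):
    return [e for e in stream
            if random.random() < (keep_support if e == +1 else keep_contra)]

unfiltered_balance = sum(world)          # what Agent C sees: clearly negative
filtered_balance = sum(curated(world))   # what Agent D sees: typically positive

print("Balance of evidence, unfiltered:", unfiltered_balance)
print("Balance of evidence, curated:   ", filtered_balance)
# The same honest update rule, fed curated inputs, ends up supporting a false claim.
```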

Perhaps I was a bit too abstract. This is a living document: as you come across new scenarios or insights into how personal narratives evolve, feel free to add them to this growing list.
