To Susan: Stones, Sticks, and Snakes
I suppose many of us who are tired of intellectuals’ claptrap are either already fond of the following anecdote, or will be fond of it once we’ve read it:

Real, lived experience should always trump fancy mental models, amirite? Pity, then, that our real, lived experience is fancy mental models. What we experience isn’t raw data, but something already heavily processed by our unconscious mental models before we’re even aware of it. We can compare the tale of the stone to the tale of the stick and the snake:
Yes. It’s a TEDx excerpt. Snicker if you want. But the anecdote takes about five minutes. A jolly Australian walks through the bush. He feels a stick scratch his leg. Then he nearly dies. Six months later, he’s walking through the bush again, he feels a venomous snake bite his leg, and yet he’s completely fine. Why?
Because raw sensation isn’t.
During the first walk, before our jolly Australian is even aware of it, his brain decides to ignore the irritated nerve endings frantically signalling to it that they’ve been envenomated in favor of the (completely unconscious) hypothesis that he’s likely just scratched his leg on a stick. During the second walk, his brain, remembering what the snake bite felt like (a stick scratching his leg), receives a similar sensation (really getting his leg scratched by a stick this time) and goes nuts: Help! Snake! We’re dying!
His brain’s second interpretation results in excruciating pain, and not because the guy’s just a wimp. (Yeah, yeah, I hear some of you mutter, “He gives TEDx talks – how could he not be a wimp?”) Whether we’re wimpy or tough, even a “raw experience” as primal as pain is not raw at all but an experience constructed by our brain’s subconscious models, whether we like it or not.
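The brain's trick here can be sketched as a toy Bayesian classifier. This is my own illustration with made-up numbers, not anything from the talk: when the raw sensation can't discriminate between "stick" and "snake," the prior expectation decides the percept entirely.

```python
def posterior_snake(prior_snake, like_snake, like_stick):
    """P(snake | sensation) by Bayes' rule over the two hypotheses."""
    p_snake = prior_snake * like_snake
    p_stick = (1 - prior_snake) * like_stick
    return p_snake / (p_snake + p_stick)

# In the anecdote the snake bite felt exactly like a stick scratch,
# so the sensation is equally likely under either hypothesis.
like = 0.5

# First walk: snakes feel unlikely, so a very real bite reads as "stick".
print(posterior_snake(0.01, like, like))  # 0.01 -- brain says stick

# Second walk: the memory of nearly dying inflates the prior, and an
# actual stick scratch now reads as "snake".
print(posterior_snake(0.80, like, like))  # 0.8 -- brain screams snake
```

When the likelihoods are equal, the posterior just equals the prior: the "evidence" contributes nothing, and the unconscious model does all the work.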
From a biological perspective, our brains don’t have to model reality correctly. They only have to model reality well enough for us to survive. There’d be no reason to suppose we could survive without perceptions “good enough” at perceiving what’s really there, but if an untruthful heuristic were close enough to the truth to help keep us alive in our ancestral environment, we can’t blame our brains for using that heuristic today – and entirely without our permission, too! Even so, we’re spirited beings. We can insist on being more than our biology, on using our brains for inquiry rather than just survival. Moreover, our heuristics may not be as bad as they’re sometimes painted:
Some researchers of cognitive bias, like Kahneman and Tversky, deny that humans reason in a Bayesian way. Others, like E. T. Jaynes, point out that the “problem” with flesh-and-blood humans is that they’re capable of much more sophisticated Bayesian reasoning than researchers may have in mind: subjects may not share all the same information, or pursue the same goals, as the researchers, and researchers’ failure to see and account for this explains many instances of supposed “cognitive bias”. (p 162)
When @susanquinn said in Bridging the Abyss, “The first reality to acknowledge is there’s no objective reality,” she, as @madpoet said, got our hackles up. None of us would actually throw stones at Susan and ask, “Feel real enough to you?” but we might be tempted to invite her to find a stone to kick while we asked her the same question. Susan said of her own assertion, “Yes, I see the conflict already in my statement.” Later on in the comments she agreed that objective reality exists, though our knowledge of it does not. But because our knowledge of reality (whether it comes from exalted cogitations or mundane experience) is … our only knowledge of reality – the only experience of reality we have – it’s not surprising that we often treat our knowledge as if it were reality itself. Being human means being prone to the mind projection fallacy:
The mind projection fallacy is a logical fallacy first described by physicist and Bayesian philosopher E.T. Jaynes. It occurs when someone thinks that the way they see the world reflects the way the world really is, going as far as assuming the real existence of imagined objects. That is, someone’s subjective judgments are “projected” to be inherent properties of an object, rather than being related to personal perception. One consequence is that others may be assumed to share the same perception, or that they are irrational or misinformed if they do not.
To make things worse, even if we were perfect little Coxbots (inference machines capable of perfectly reasoning even in the face of uncertainty), we could, given sufficiently different prior information, still disagree with one another. In a case like this, otherwise-perfect reasoning, if marred by the mind projection fallacy, would have us warring over differing “realities” (perceptions of reality) even though no “reality” (perception of reality) was wrong.
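To make the point concrete, here’s a toy sketch of my own (the coin, the numbers, and the names are invented for illustration): two flawless Bayesian reasoners observe identical evidence, but because they start from different prior information they reach opposite conclusions, and neither has reasoned wrongly.

```python
def update(prior, p_heads, flips):
    """Posterior over hypotheses after a sequence of coin flips.
    p_heads[h] is P(heads | hypothesis h)."""
    post = dict(prior)
    for flip in flips:
        for h in post:
            post[h] *= p_heads[h] if flip == "H" else 1 - p_heads[h]
    total = sum(post.values())
    return {h: v / total for h, v in post.items()}

p_heads = {"fair": 0.5, "trick": 0.9}
flips = ["H", "H", "H", "H"]  # both reasoners see the same four heads

# Alice has handled this coin for years; Bob just met a street hustler.
alice = update({"fair": 0.99, "trick": 0.01}, p_heads, flips)
bob   = update({"fair": 0.50, "trick": 0.50}, p_heads, flips)

print(alice)  # fair is about 0.90: Alice still calls the coin fair
print(bob)    # trick is about 0.91: Bob calls it a trick coin
```

Both posteriors follow correctly from the same evidence; only the priors differ. If each then commits the mind projection fallacy – treating his posterior as reality itself – they’ll war over “realities” even though neither one’s inference was wrong.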
Susan’s quip, “The first reality to acknowledge is there’s no objective reality,” is thus a punchy exaggeration acknowledging how humans really behave – acknowledging that all our knowledge, including the realest of our “raw experience” (which our brain already cooks the books on), is subjective; that we’re so good at the mind projection fallacy we reflexively believe our perceptions are reality; and that we consequently find it easy to demand that “Everyone has to share our perceptions and agree with them.”
Reflexively supposing our perceptions are reality is not wrong. Often, this supposition is quite sensible and very necessary: when a truck’s about to run you over is not the time to doubt your perception of the truck if you want to live! It’s not wrong, but it is incomplete, a reflex we must sometimes override in order to be more realistic – more honest, more truthful.
It’s a reflex we find ourselves overriding less, though, when we’re among those whose perceptions overlap with ours, those with whom we share a history of common experience. That’s one reason to prefer “our own kind”: second-guessing our trust in our own perceptions is distressing, especially when we already feel threatened, because it means second-guessing a reflex that not only simplifies life, but safeguards it.
Much of what’s called “being healthy” amounts to having perceptions that are socially normal. When we’re “healthy,” our perceptions – our five senses, our pain-sense, even our moral sense – seem to work much as others’ seem to work. It’s not surprising, then, when conservatives assert a “healthy society” is one that sets standards declaring which perceptions are normal and therefore “healthy”: without those standards, how would we even tell what “healthy” is? Everyone second-guessing everyone’s perceptions is paralyzing. If we can agree to a standard of normalcy for “enough” perceptions, on the other hand, we can get on with life.
When we agree to that, we’re agreeing, not to avoid the mind projection fallacy (since that’s impossible in daily life: none of us has enough time or energy to always avoid it), but to share a mind projection fallacy, to agree that the world really is a certain way just because we all agree to see it the same way.
No wonder tensions can run high when we discover not all of us do see something the same way. No wonder we start wondering whether those disagreeing with us are “sick,” “a menace to their own kind,” or “not even interested in our survival.” Second-guessing our own individual perceptions is distressing enough. Second-guessing the perceptions we thought we had in common is even worse. At that point, it’s tempting to throw stones at the “sickos,” demanding, “Feel real enough to you, punk?” But then, the stick and the snake felt real enough, too, despite each being an illusion of the other.
Think about how intoxicated you have to be to be a menace behind the wheel. That level of distortion in perceptions is enough to be a problem, even though it’s nowhere near full-on hallucination. Any number of things, not all of which are recreational drugs, can result in about that much impaired perception.
The impairment in our perception doesn’t have to be really, really obvious in order to make a difference.
TL;DR but Lorimer and the NOI Group are doing some really good work in chronic pain.
Yes – when you’re driving is a classic example of a time when accuracy in perceptions that might be inconsequential otherwise becomes really important.
I was driving one night, exhausted, with a migraine. Migraines can make lights seem painfully bright. This huge truck with lights so bright even a glimpse of them in the side mirror almost blinded me was tailgating me.
Turns out the huge truck was a police car: I didn’t have my headlights on and failed to notice because of the migraine. Yes, the road ahead of me seemed strangely dim, but that seemed attributable to the migraine dimming out anything not as blindingly bright as the lights behind me. Fortunately, I was let go with a warning. I’m compensating, of course, by checking to make sure the headlights are on at night even when it looks to me like they are. But now I know I can’t just trust my perceptions of what I see through the windshield to tell me that they’re on.
Btw, I don’t think it’s correct that Berkeley was denying objective reality. And even if that reading were correct, it would be a bit misleading.
To those who care, I recommend my cartoon version of Berkeley’s book – see the previous page of comments.
Of necessity everything the brain detects about “reality” is a model of that objective reality. It cannot duplicate it.
The requirements for survival of the organism include a model complete enough to detect the vast majority of severe threats to the survival of the organism, to do so in a short enough time to make a response possible, and to direct the response.
The fact that billions of humans are alive today is reasonably good empirical evidence that, for our organism, this modeling of objective reality is very successful.
As a model, it of course sometimes fails, but as long as it succeeds in a large fraction of cases, it is doing its required job.
As a systems engineer, I am always amazed by, and stand in awe of, the capability of the human mind. Even our best attempts give the systems we build only a small fraction of the abilities inherent in the human mind.
We design systems to react to external threats through planned failure: by suitable design, deliberate weak points are included that will fail in a predictable fashion, converting death into minor injury or inconvenience. As an everyday example, think about the airbags and crumple zones on your car. Either technique serves to limit the forces a crash applies to you or your passengers, making the otherwise unsurvivable survivable. We learned this approach at least partly by studying human and animal behavior. Think about throwing up your arms during a fall to protect your head or face.
Our designs, while improving radically over time, remain poor copies of the God-created abilities we give thanks for having.
We should share the same “projections” to the extent they are true.
Some shared cultural “projections” are merely conventional, but nonetheless useful – indeed imperative – such as driving on the right side of the road. It matters more that we all follow the same convention than what the convention is.
Of course there’s a third category of cultural practices that are neither true nor imperative but merely optional. The trick is distinguishing the three cases.
The funny thing is that while it’s models all the way down, nevertheless at the bottom is the object of our perceptions.
Being healthy consists, in part, in having perceptions that tend to be true, not merely socially acceptable.
Assuming others’ sense perceptions are healthy.
The conservative position is that the standard for judging which perceptions are healthy is that they tend toward the truth.
We tell what healthy is through reason, not by reference to arbitrary standards. Sure, we compare our perceptions to social standards, but those standards are themselves subject to being judged against the truth at the bar of reason.
Not at all. I absolutely do not agree to any kind of fallacy per se. The only standards of normalcy of perception we should accept are those that tend to conform to the truth. That’s the opposite of a fallacy.
Do you treat cause-and-effect models as if they were real?
How good is your brain at recognizing that pain and tissue damage are different? If it’s a normal brain, it’s not good at recognizing this – arguably, healthy people should not have brains that recognize the difference.
Yes, causality is real. I will go further and say that causal efficacy is a necessary condition of reality. Nothing is real that doesn’t partake in a causal relation.
Pain doesn’t have that cognitive content. Pain has normative content. Pain is a preliminary judgment that something ought to be avoided. It’s nearly always correct in my case, although I recognize that disorders exist that reduce its reliability in some people.
But even if perceptual judgements were much less reliable than they are, that would not justify intentionally accepting a fallacy.
People do recognize the difference between pain and tissue damage, as they should insofar as they are different.
I take it as axiomatic that one ought to believe the truth and not falsehood. Any line of philosophical musing that leads you to think it’s healthy to believe untrue things has gone badly wrong somewhere.
You treat cause and effect models as if they were real?
It is, of course, efficient, if you have what seems to be a working model of cause and effect, to treat it as real, rather than as a model. To treat it as real rather than a model is the mind-projection fallacy.
What do you mean by “pain doesn’t have cognitive content”?
Yes.
Hence my observation that healthy people can treat their pain as being “correct” (because it is nearly always correct, and that’s good enough for most people and purposes), while having a condition that reduces its reliability is having a “disorder” – not being healthy.
Except, I’m not really talking about being healthy. I’m talking about “being healthy” – that is, being in tune with a social agreement on what constitutes being a “healthy” person in a “healthy” society.
If, in your social circle, everyone’s perceptions but yours were distorted in the same way, wouldn’t it be likely that your perception would be treated as the sick one? I’m not asking, is it right (I agree with you it isn’t), but is it likely?
How sure are you of this? Not of what you think, or what conservatives ought to think, but of what they really do think.
How much do we want truth rather than unity, strength, or civilizational confidence? Are we not willing to sacrifice a little truth to pursue other goods? What about those conservatives who explicitly say we should defend our culture not because it’s right, but simply because it’s ours – and therefore should be defended even if it could be wrong, just because it’s ours?
Yes. To the extent the models are true, they are real. Obviously, models can be false and those models aren’t real, but the true ones are real.
To put it another way, reality consists of the same sort of thing that you are calling a model. A true model doesn’t just represent reality: it is (part of) reality.
My position is that what you are calling “mind projection” is just understanding and that it is not fallacious at all (if the understanding is true).
No true Scotsman or conservative would think otherwise.
We are not. We pursue strength etc. by pursuing truth.
Does anyone in fact hold that position?
If the social consensus on what constitutes health is incorrect, it should be challenged, not accepted for the sake of social cohesion.