Your Beliefs Probably Aren't Yours (And That's Not Your Fault)
Discover why most of what you believe comes from others, and why you can't tell the difference. 20 years of practice, one uncomfortable truth.
I was in my mid-twenties when I asked my mother a simple question about one of her religious practices.
"Why do you do that?" I asked. "I don't remember reading it anywhere, and I've never heard the priest mention it."
Her answer came fast, the way answers do when someone has never needed to think about them: "Because that's how it's done. Stop asking questions."
I remember the moment clearly—not because of what she said, but because of what I felt. A small click of recognition. Wait. That's not a reason. That's just repetition dressed up as tradition.
What I didn't realize at the time was how many of my own beliefs worked exactly the same way. How much of what I confidently called "my philosophy" was just things I'd absorbed from people I trusted, repeated until they felt like my own thoughts.
It would take me another twenty years to do the audit. The results were humbling. Turns out I was better at spotting borrowed furniture in other people's apartments than in my own.
The Closet Full of Someone Else's Clothes
Here's a metaphor that helped me understand what was happening:
Imagine you moved into a furnished apartment twenty years ago. Over time, you stopped noticing that none of the furniture was yours. You arranged your life around someone else's couch, ate at someone else's table, slept in someone else's bed. Guests complimented your taste, and you said thank you—because by now, it felt like yours.
That's what borrowed beliefs are like. You wear them so long you forget they came from someone else's closet.
The realization hit me during a conversation about meditation techniques. Someone asked why I practiced a particular breathing method—the kind where you follow your breath for twenty minutes, supposedly activating the parasympathetic nervous system and reducing anxiety.
I started explaining. Confidently. With the ease of someone who'd done this for two decades.
I described the mechanism. I cited the benefits. I explained why this particular approach was superior to alternatives. The explanation was flawless. I'd heard it a hundred times.
Halfway through, I stopped.
"Wait," I said. "Have I actually tested this? Or am I just repeating what I was told?"
The answer was uncomfortable: I was repeating. Word for word, almost. Like I'd swallowed someone else's script and forgotten it wasn't my voice.
Had I actually done twenty minutes of breath-watching consistently? Had I noticed my anxiety go down? Had I compared it to other meditation types to see if this one worked better for me? (I've written about what actual testing looks like—it's messier than the explanations suggest.)
Not really. I'd read about it, talked about it, and believed I understood it. Twenty years of practice, and I was out here reciting paragraphs like a parrot who'd read about the Bhagavad Gita once.
Looking back, I genuinely wonder how I called myself a skeptic while running someone else's software.
Why We Do This (And Why It's Not Stupid)
Here's the thing: borrowing beliefs isn't a character flaw. It's a feature.
Most of the time, trusting experts is the smart play. You don't personally verify that antibiotics kill bacteria or that gravity works. Your doctor studied medicine so you don't have to. Your mechanic understands engines so you can just drive the car. Life is too short to test everything from scratch.
The problem starts when you apply this strategy to domains where it fails: meaning, values, spiritual practice, personal development. Because in these areas, what works for the expert might not work for you. Their path isn't your path. Their experience isn't your experience.
And outsourcing your thinking here doesn't save time—it wastes it. You end up practicing someone else's development instead of your own. This is what your guru forgot to tell you—verification isn't optional, even when they say "verify everything."
What makes this tricky is something psychologists call the "fluency heuristic"—when something feels familiar, the brain treats it as true.1 Repeat an idea enough times, hear it from enough respected sources, and it starts feeling like something you've always known. Like something you figured out yourself.
You didn't. But it doesn't feel that way from the inside.
The Seduction
Here's what I noticed once I started paying attention: borrowed beliefs feel exactly like genuine understanding.
When you've heard something explained well by someone you respect, when you've repeated it enough times that it flows smoothly, when everyone around you nods in agreement—the brain treats it as verified knowledge. Familiarity stands in for evidence. You can't tell the difference from the inside.
It feels like truth. Like something you've figured out.
Borrowed beliefs feel exactly like genuine understanding. That's what makes them dangerous.
But there's a difference between understanding an explanation and knowing from experience. Between articulating why something should work and actually having tested whether it does.
And here's what's seductive about it: borrowed certainty is easier than earned understanding. You get the social benefits—sounding informed, fitting in, avoiding the discomfort of uncertainty—without doing the hard work of verification.
You get to feel like you know things. Without the effort of actually finding out.
And if you're the type who built an identity on being a critical thinker? Let me tell you, that realization stings a bit.
The Tell
Here's how you know a belief isn't really yours:
You defend it reflexively, but you can't explain why from your own experience.
Someone questions the thing you believe. You feel a flash of defensiveness—an instinctive pushback. Not a calm, rational "let me reconsider." A hot, sudden "no, you're wrong." Like you have to defend your identity, not your argument.
That heat. That immediate "no" before thinking. That's not conviction. That's attachment to being seen as someone who knows.
We're excellent at spotting this in other people, by the way. When they do it, we call it being defensive. When we do it ourselves, we call it having principles.
When you try to articulate why you believe it, you realize you're citing authorities. Repeating explanations you've heard. Gesturing toward logic that made sense when someone else presented it.
What you can't do is say: "Here's what I noticed when I tested this. Here's what happened when I tried the alternative. Here's why, based on my actual experience, this seems more accurate."
That gap between defensive feeling and experiential foundation? That's the tell. That's where borrowed beliefs live.
I noticed this constantly once I started looking. Someone would challenge an aspect of my practice. I'd feel the resistance flare up instantly. Not because I had strong evidence it worked, but because accepting the challenge would mean admitting I didn't actually know what I was doing.
The resistance wasn't protecting truth. It was protecting furniture. Protecting the image of someone who'd chosen wisely, when really I'd just been inheriting things.
The Audit
That night I did something I should have done years earlier. I started making a list.
Not of what I believed. Of what I believed and had actually tested. The difference matters more than it sounds.
The list was disturbingly short.
Most of what filled my head was inherited. Downloaded. Outsourced to authorities I respected so much that I'd stopped checking their work.
For each belief, I got curious. Not in a cold, analytical way—more like examining a bruise to see if it still hurts. Here's the framework I used:
- Origin Check — Where did this originally come from? A teacher? A book I respected? Someone in a position of authority who seemed certain?
- Experience Check — Have I actually tested this myself? Not "did I practice what I was told" but "did I experiment, compare, notice results?"
- Defense Check — Could I defend this from my own experience? If someone challenged me right now, could I say "Here's what I noticed" or would I just cite authorities?
- Identity Check — What would I actually lose if this belief turned out to be wrong? Am I protecting truth—or protecting identity? Because there's a big difference.
The process was humbling. Psychologists call this "belief perseverance"—we defend beliefs long after the evidence has shifted, because admitting we were wrong feels like a threat to who we are.2 Most of what I'd been confidently asserting turned out to be inherited—downloaded from sources I respected but never verified myself.
What Survived
Some beliefs held up under scrutiny. I tested them, noticed they worked, converted them from borrowed to earned.
One belief I verified was simple: paying attention to what's actually happening—versus what I think is happening—changes how I respond to stress. I sat with this for months. I noticed it. Compared it to times I didn't pay attention. Saw the pattern. The belief held up.
But here's what surprised me: the beliefs that survived verification were often simpler than the borrowed ones I'd been carrying.
The borrowed beliefs tended toward sophistication. Complex explanations. Multi-step processes. Systems and frameworks that required specialized vocabulary.
The verified beliefs tended toward: "This seems to work. I don't know exactly why, but I've tried it enough times to notice a pattern."
Less impressive to teach at a dinner party. More useful in actual life.
But they were mine. Built from direct experience, tested against alternatives, earned through actually paying attention.
And paradoxically, they were more useful. Because when circumstances changed, I could adapt them. I understood them from the inside. I wasn't dependent on the original authority to tell me what to do next—I had my own foundation to build from.
The Ongoing Work
I still borrow beliefs. Still trust authorities. Still take shortcuts—because life's too short to verify everything.
The difference now is I can feel the difference. I know when I'm speaking from something I've lived versus something I've repeated. And I'm faster at catching the moment when I start defending a position I haven't actually tested.
That catching feels like relief.
Not because I figured it out. But because the apartment is finally starting to feel like home.
For You
I think about my mother sometimes. She lived in a house decorated by tradition, rearranged by guilt, furnished by "that's how it's done." I'm not sure she ever realized she could have put her own furniture inside.
I realized it. Took me twenty years, but I realized it.
And now the apartment I'm living in? Some of the furniture is still borrowed—from books I trust, people I respect, frameworks that work. The difference is: I know it's borrowed. And I can move it around if it stops fitting.
So here's the question: How much of what you believe have you actually tested? And how much is just furniture from someone else's apartment?
And I know what you're thinking: "This guy's asking me to audit my beliefs? Sounds exhausting."
Yeah. It is.
But here's why it's worth it:
Every belief you verify becomes yours in a way borrowed beliefs never can.
You can modify it, teach it, adapt it. You're no longer dependent on the original source to tell you what comes next.
Worth checking. The audit is uncomfortable. But the apartment feels different when it's finally yours.
Continue Exploring
What Your Guru Forgot to Tell You — Why "verify everything" sounds good but almost nobody actually does it.
What Twenty Years of Meditation Actually Taught Me — The one lesson that changed everything: you are the foundation.
Notes & References
1 Reber, R., & Schwarz, N. (1999). "Effects of perceptual fluency on judgments of truth." Consciousness and Cognition, 8(3), 338-342.
2 Ross, L., Lepper, M.R., & Hubbard, M. (1975). "Perseverance in self-perception and social perception." Journal of Personality and Social Psychology, 32(5), 880-892.