I’m sure the title of this article immediately calls to mind the scene from A Few Good Men where Jack Nicholson’s character screams, “You want the truth?! You can’t HANDLE the truth!”
Well, such is life.
Listen, I’ll give you the truth for a minute, here, and tell you that I’ve been trying this whole “honesty” thing out for a few years now, and to be HONEST, it has gotten me absolutely nowhere. Not with relationships, anyway.
No one wants to hear relationship truths right from the get-go. Apparently, we all want to wait YEARS to get to the truth of the matter, and when we eventually do find out the truth about one another, we are so, so disappointed, and the relationship ends if we can’t accept those truths.
Why do we need to know every detail about the car we’re buying or the house we’re about to take out a 20-year mortgage on, and have these things inspected, yet let our relationships (the truly important things in our lives) fall by the wayside? What are we afraid of?
Being honest with myself and others has helped me reach one conclusion, though: I fully condone telling a lie if it’s going to save someone from being hurt. Other than that, lying is really useless. Unless, of course, you have a small child who asks an unanswerable question like “What happens after we die?” Then you lie, or rather, make up a fairytale about all things good, or share your honest-to-God belief about what happens after death, which isn’t a blatant lie, but an opinion.
My problem is this: people don’t want to hear the truth.
Our society is built on lies. It’s built on going against the truth of matters for many reasons: political gain, power, popularity, money. Of course, this leaves out the discussion of nature and science (which we’ll save for another day).
A very controversial truth is that no one knows what happens after we die. No one knows if there is a God. No one KNOWS those things, and if they claim to, THEY ARE LYING. If we take it further, no one REALLY knows anything. The problem is that, when faced with these unknowables in life, we can CHOOSE what to believe, or not.
Why then, do we believe lies more often than not? Why are lies about ourselves and about life and about others what we most often choose to believe? Society tells us that we need to look a certain way, buy certain products, that we NEED these things. We do not. It tells us that what we see on the TV screen is REAL! It’s not. It tells us the things we put into our bodies won’t harm us. They will.
It’s a lie that we can all look like models. Most of us never will, but is coming to terms with this simple fact something we just cannot bear to accept? Why do we accept that our society goes against the very NATURE of humanity?
Humans are not monogamous by nature, yet we are supposed to strive for monogamy? We are supposed to fight and go against every instinctual thing about ourselves, for what? To alleviate suffering?
False beliefs are the very thing that CREATES suffering.
We believe the things we are told, thinking they make life easier, when in fact they cause us undue suffering. We take YEARS to “come to terms” with our bodies, our minds, and our lives, all of which are normal by any standard, and become depressed when we can’t fit into the one-size-fits-all molds of our society.
Society tells us we are nothing if we can’t look this way, don’t drive that car, don’t have this job, don’t eat that, don’t have kids, a marriage, and a house. We spend years driving ourselves into debt, starving ourselves, and spending hours upon hours on procedures, or dressing a certain way, to fit a standard that is, in all practicality, UNREALISTIC.
What if truth was popular? What are we afraid of? Acknowledging that we are selfish, not always in good health, and living a mediocre life? Why is that bad? Why can’t we handle the truth?
Stop letting everyone else lie to you, and more importantly, stop accepting those lies.