Dealing With Deceit

There are no two ways about it: at some point in your life, you have been deceived, and you will be deceived again. There is something, right now, that you hold in your heart as true, and it is not. Lies and deceit are a part of the human condition that cannot be avoided; they grow like weeds, and they spread like disease.

There are two broad categories of deception, as I see it: deliberate and mistaken. There is some overlap, inasmuch as one may engender the other, but the heart of how I define them is that deliberate deception is when a person means to deceive, and mistaken deception is when the deceiver believes they are telling the truth but are in error. Once the deceit is passed along, a deliberate deceit can become a mistaken deceit when the person who hears it believes it, and repeats it as true. But there have always been those who know better and pass it along anyway, so that it remains, in that case, deliberate. You could, I suppose, break it down further to malicious deceit, where the purpose is to obscure the truth to promote an agenda of the teller, and well-meaning deceit, where the purpose is to shield the hearer from an unpleasant truth “for their own good.” I have some quibbles with that distinction. I don’t think deliberate deceit is ever a good thing, and this includes “little white lies.” If you can’t tell someone the truth without hurting them, don’t tell them anything until they are ready for the truth. Furthermore, that kind of attitude is a slippery slope into a bottomless pit – where does it stop? It doesn’t take very long for a single lie to become a plethora of them, and then they get repeated, and soon no one knows what’s true and what isn’t on that subject. Just try researching something controversial on the Internet if you don’t know what I mean by that.

Let enough deceit proliferate, and it can dominate a culture, and I doubt there has been a culture in the history of humanity where this has not eventually happened. Once a culture is dominated by deceit, the foundations of trust become rotten. When trust is gone, because those foundations have crumbled, so is the basis for authority. Brutal repression is the result of authority without trust, and eventually cultures of repression collapse in revolt, or when the center can no longer hold against outside forces because too much of its resources are spent in the repression. And the sad result, as any student of history can tell you, is that the new culture that arises from the ashes of the last learns nothing, and does the same.

But what is it about human nature that makes deceit so easily take root? I personally believe there are spiritual reasons at the heart of the matter, but even utterly secular psychologists recognize that there are some very concrete and definable limitations to how the human mind works that make it possible. Even a cursory web search of the term “cognitive bias” will provide you with lots of fascinating reading on the topic. Many of the recognized biases provide a genuine advantage in circumstances when those around you are trustworthy. It’s the deepest irony that they also become a horrible liability when those around you are not.

One strong example of this is known as the “bandwagon effect.” Have you ever sat in a group of sports fans and found yourself cheering for a team you normally didn’t care about? Perhaps a die-hard fan would never do such a thing, but I am not a sports fan of any sort. I enjoy watching a good game now and then, but I don’t have any teams of my own. Yet if I’m at a Super Bowl party, I’ll root for whoever my host and other friends are rooting for. This is a large part of what gives politicians “momentum,” and can become a driving force behind rallies and protests. It is a defense against error, in that if an outsider tries to foist a deception on you, those around you will reject it and protect you from that error. But what if your group is in fact the party in error? It’s easy to see how you may tend to accept the falsehood because those around you do.

Another is an effect called “confirmation bias.” The effect is described as a tendency to be more accepting of opinions, statements, and expressions that support something you already agree with. You only have to spend two minutes on Facebook to see that one in action. This is what drives conspiracy theories and resistance movements. If your ideas are accurate in the first place, this is a good defense against error. But if you are already mistaken or deceived, it traps you in that deceit, and you will never rise out of it until you see what you have done to yourself.

I’m not sure if there is a name for it, but there is a related effect where people have a tendency to agree with, or support, a person who expresses one concept they already accept as true. It doesn’t matter if everything else they say, or only some of it, is wrong or mistaken. They accept it all. Evangelical Christians seem to be particularly vulnerable to this: if a politician says he is a born-again Christian, they favor him (or her); if a teacher espouses a doctrine they agree with, they look favorably on everything else he says as well. It can be a vicious trap if the person they support is out to deceive, or is mistaken themselves. I fell into this trap myself many years ago when I heard a politician give a very good testimony of his Christian beliefs. I voted for him, and he turned out to be one of the worst to hold that office in my memory. That person was more than likely honest in the testimony he gave, but I accepted other things about him on that basis that just weren’t right. And today, I think many candidates just find out what they think Christians want to hear and repeat it so they can get their votes. That’s a deliberate deception.

There is a host of other biases and commonly accepted logical fallacies (another great term to look up) that add to the problem, but you could write a book on those alone, which is why I offer only a few examples here. But if no one ever thought to deceive, most of them would not be problems at all.

So then what leads people to deliberately deceive? I think it’s simple enough. It gives them an advantage they do not think they would have if they were being honest. Perhaps they will gain power or wealth (though I tend to believe wealth is simply another type of power); perhaps it will get them out of trouble, or help them to avoid an uncomfortable situation. Invariably, it arises from weakness. A strong person, in a position of advantage, does not have to lie or deceive to gain anything. Someone who is comfortable with themselves does not mind if another doesn’t like their opinion, and doesn’t have to lie about it. In other words, the deceiver doesn’t feel like they have the ability to win in a fair fight, so they must cheat. Someone who feels like they don’t need an unfair advantage is perfectly fine with the truth. In today’s social media / Internet climate, a driving force is advertisement revenue. Far too many people latch on to things they see online that are misrepresented, out of context, spun out of proportion, or outright falsified. They click on links, and the parent web site gets additional traffic that it can show to advertisers to get better placement and money for ads. Because a lot of this happens behind the scenes, people simply aren’t aware of it — they think it’s just an exchange of news, information, or opinion. But it’s nothing of the sort. It’s deliberate deception to gain a financial advantage they wouldn’t have if they were truthful.

Which leads to the inevitable question: if we are so surrounded by unscrupulous and dishonest people, how can we keep from being deceived?

First off, you have to learn to recognize your own weaknesses and be aware of them. Look up “cognitive biases” and “logical fallacies.” Determine which ones you are vulnerable to, and rethink your opinions. This is an extraordinarily difficult thing to do, simple as it sounds. It can be a lifelong process of self-revelation. But you need to start somewhere, and if you have determined in advance that you can’t be mistaken about something, you have already lost. This is a particularly hard thing for people of faith to do, because they feel that challenging one thing they have already accepted is tantamount to challenging the very basis of everything they have ever believed in. That, in itself, is a logical fallacy. You can be wrong about one thing in a particular sphere of knowledge, and right about others. For those who may struggle with that, I offer this insight: true faith has nothing to do with what you believe or how you feel, and everything to do with the object of your faith. You do yourself no favors if your faith is misplaced; and if what you believe is really true, then examining it more closely won’t hurt anything, and can only strengthen it. But you owe it to yourself to be sure that the object of your faith is, in fact, worthy of the trust you have placed in it. Else, you have just become another victim of deceit.

In today’s world of misinformation, there are some other skills that are needed. One very important one touches upon what I have said before: never accept that everything you think you know is fully and completely correct. You invite deception when you do that, and make yourself more vulnerable to it. It is not an expression of strength to steadfastly deny the possibility that you could be wrong. It is a weakness. A strong person can question and affirm themselves. A weak person can’t even consider they might be wrong. Going back to what I just said about faith, consider the fact that you can only confirm your knowledge as you discover more truth … and if what you uncover denies what you have previously accepted as true, you have done yourself a great service by eliminating a matter of self-deception. A corollary of this premise is that even if the object of your confidence is 100% reliable, it doesn’t mean your understanding of it is 100% correct. A good scientist understands this, because they constantly revise theories based on new observations and understandings (and let’s leave off the discussion about bad science for another day, it certainly exists). But people have a rough time doing the same with personal beliefs. I want to pull my hair out every time a Christian says, “the Bible says such-and-such, and I believe it, and that’s enough for me.” There is even a popular Christian song based on that drivel. Just because the Bible is true, it doesn’t mean your understanding of what the Bible says is correct. You have to be willing to look harder and deeper all the time, or you are going to lock yourself into falsehood by your inability to question yourself. I cannot emphasize enough, however, that once again, this is a lifelong process. I call it growth; and if you stop growing, you are already dead.

Another vital skill is the ability to question what you hear. It is remarkable how few people actually do. They trust the news, they trust their peer groups, they trust what’s published online or in print, and not once do they stop to consider that the source, or the source’s source, may not be trustworthy. Or they fail to recognize that their understanding of what they heard or read is mistaken. This is epidemic online. I regularly see an outrageous statement online, and when I look into it, find everyone is quoting the same source, and the source had an obvious bias or agenda. I have seen people repeat things as fact that, when researched, turned out to be satirical in the original. I have researched things repeated as fact and discovered the source completely made it up … yet it was repeated because the hearer had some measure of sympathy with the statement. They wanted to believe it, then spread it around as worthy of confidence when it never was. I have researched things repeated as fact that were based on statements pulled completely out of context, so that even though the literal quotation was accurate, the intent of the original source was not even remotely what the person repeating it said it was. Again, we have a mix of deliberate and mistaken deceit, but deceit is the end result either way. Questioning what you hear is frequently an exercise in finding out whether anyone else says the same thing, and whether any or all of them can be considered trustworthy. Sometimes, you simply cannot be sure, and you have to file it away as “undetermined.” Which leads to another thought: it’s OK to accept something provisionally, as long as you are open to revisit it when you have better information. But it’s also OK to reject it provisionally, or just be neutral about it provisionally. We can’t all know everything (which goes back to my first point). If it matters to you at all though, you absolutely must be willing to look into it, not just jump on the bandwagon.

You also need to be able to disassociate your emotional reaction to something from your acceptance of it. Emotions are a shortcut our minds use to help us filter data, but almost all the cognitive biases and logical fallacies that exist hinge on an emotional response. If your emotions are trained wrongly, they are going to react wrongly. In short, not being deceived requires that you not be lazy about what you believe. Just because you always accepted something, and your peers and teachers taught it to you a certain way, it doesn’t mean they got it right and it doesn’t mean you have it right. It is normal and natural to reject a concept that opposes what you already have accepted. But if you do not want to be deceived, you must set aside this emotional response and do your best to verify your beliefs. I started this article saying that everyone holds something in their hearts as true that really is not — and I do mean everyone. Myself, those who have taught me over the years, experts who have made public statements, people who are trusted by many … everyone. The question is, are you willing to challenge those things, look at them squarely, and find, to the best of your ability, the truth of the matter? Are you willing to accept you might never get the whole truth, and that’s OK? Are you willing to challenge your own beliefs? If you are not, you have already lost. If anything at all you hold as true is a deception, everything else you base on that belief will be untrue as well.

The bottom line on dealing with deceit is that you can never let yourself become complacent, and you must be willing to keep growing in knowledge and wisdom for as long as you draw breath. Frankly speaking, it’s a bit frightening … some may even find it terrifying. But sticking your head in the sand and denying it is no help either. It is a sad fact that some would rather be deceived than entertain the thought that they have to work out the truth themselves. Their fears and insecurities bring them to that place, and they simply do not have the courage to do anything but hide within their house of cards and hope the wind never blows it down. But any sense of security they might feel in that is a false security. And that thought leads to an inevitable question: what if the truth you find is something horrible? This is where people of faith have a huge advantage: they already have some measure of confidence in something bigger than themselves to help. But remember what I wrote earlier: true faith does not depend on what you feel or believe, it depends on the trustworthiness of what you believe in. A true, real faith has reasons behind it that are more than feelings or self-deceit. It is true that it is easy for people of faith to be deceived in their faith, but it’s also true that if their faith is not misplaced, they can work out the truth and not be deceived by other things. But anyone, whether a person of faith or not, must be looking to find the truth, and not just to appease their own fears, biases, or desires. And everyone needs to ask themselves if that is what they are doing, because if you are unwilling to seek truth and set aside your preconceptions to find it, you are doomed to be deceived and to suffer for it.
