“The Navy doesn’t have F-16s.”
I cannot tell you how many times I’ve read this on social media. Seems like every time I post a photo or video from my time flying the U.S. Navy’s dozen or so F-16As and Bs based at NAS Fallon, Nevada, someone responds with this proclamation.
What I find sadly amusing is not the respondents’ misinformation but rather their conviction. It’s almost never a question. The sky is blue. The Pope is Catholic. The Navy does not fly F-16s. Period. One gentleman took it a step further responding to a short Instagram clip I posted, “I assume you stole this video from someone else and have no idea what you’re talking about.”
Admittedly, the Air Force is the predominant U.S. operator of the F-16, both numerically and operationally, but the Navy has maintained a small fleet of “Vipers” off and on since the N model debuted in the mid-1980s. Turns out the little jet makes a capable, cost-effective adversary for air-to-air training that is an absolute blast to fly (see Paul Nickell’s compelling article from The War Zone on what it was like to fly the F-16N).
As oddly satisfying as it is to correct people who respond so determinedly, this whole ordeal has me pondering the more relevant question of why folks are wrong in the first place. With a little digging, I discovered the reasons are many. One is that our information is simply flawed: incomplete or inaccurate. Anyone who made, or believes, the assertion that opens this article is a case in point.
Another is false memories. Research suggests humans are prone to creating memories of events that never happened, or that unfolded differently from how they are “remembered.” Naturally, the longer ago the event, the more susceptible the memory is to distortion.
But the most insidious reason we are sometimes wrong is “expectation bias,” in which we observe events not as they are but as we expect them to be, based on our uniquely individual biases. These “filters” develop over time as a result of our upbringing and experiences, and we all have them; they influence how we perceive events. (This one frightens me a little.)
Each of us is prone to making any of these types of mistakes, and we all seem to have distinctive ways of reacting to correction. When I politely inform my social media friends of their oversight, I get everything from silence to indignation to genuine delight, usually sprinkled with humor. A very small minority doubles down and firmly holds their ground, proving that pride, indeed, is deservedly one of the “Seven Deadly Sins.”
The takeaway from all this, however, is the troubling revelation that, surely, I too fall victim to this fallacy. How many times have I been wrong about something when I was so sure I was right? Did I, in other contexts, declare with equal conviction that the U.S. Navy absolutely does not fly the F-16? And to an authoritative show focused on military aviation, no less?! Most assuredly.
I think it’s safe to say we have all been wrong at some point and, likely, will be again. Soon. Since we cannot change the past, the best we can do is to try to minimize the damage of such lapses in the future. But how?
For starters, we should be mindful of the various reasons people are sometimes wrong in the first place, as described earlier. Unless we are the undisputed, industry-leading subject matter experts on a topic, we should consider our information at least partially suspect. And even “experts” are occasionally wrong.
Second, we would do well to consider the setting and our audience. Before commenting that a set of mounted antlers is from a moose, not an elk, for example, I may want to note that I am responding on Taxidermy Today Magazine’s Instagram page. After all, one might reasonably expect an organization with such a name to know what it’s talking about.
And finally, as with so many things in life: delivery is key. As important as decisiveness is in certain matters, in others it is better to take a softer approach. Instead of the declarative “the Navy doesn’t have F-16s,” a simple restructuring of the sentence comes across as far less brutish. “The Navy has F-16s?” or “I didn’t know the Navy has F-16s” are simple ways to invite discussion instead of invoking a visceral response.
Another technique is to lead with an “out” such as, “I could be mistaken, but…” or “Correct me if I’m wrong, but….” These phrases help save face by inviting participation and correction in the event our information is less than perfect. Even better, such an approach is much easier on the ego if (when) we end up being wrong, which we are all sure to be once in a while.
In fact, I could be wrong at this very moment….