A couple of articles have recently been making the rounds about which politicians are the most honest. Using data acquired from PolitiFact, the political fact-checking organization owned by the Tampa Bay Times, American News X has now run two opinion pieces by Charlie Barrel on the topic, and both have gone semi-viral among the site’s target left-of-center demographic.
In the first article, Barrel created what he calls an “Honesty Index” based on the number of statements that are true versus the number of statements that are false, according to PolitiFact’s scoring system. The result, according to Barrel, is that most of the liars are Republicans, while most of the honest people are Democrats.
In the second article, Barrel bemoans the fact that Republicans in general (though not solely) find Hillary Clinton inherently dishonest. Again using PolitiFact data, Barrel this time attempts to prove that Hillary is more honest than Donald Trump.
While Barrel’s analysis is interesting in its own way, I’m not sure it does what he claims. In fact, thinking it through a bit, several problems emerge.
What Does PolitiFact Measure?
There is an inherent assumption in Barrel’s first article that PolitiFact is measuring honesty; however, what PolitiFact actually measures is correctness. Some people may believe that these are the same thing, and certainly words like “true” and “honest” are often used synonymously in everyday language, as are “untrue” and “lie.”
However, the difference becomes clear when you consider that someone can easily make a false statement while believing wholeheartedly that it is true. That person is not lying; they are simply wrong. Their wrongness can be the result of ignorance or misinformation, or possibly even a misstatement (i.e., they knew the right thing but merely stated it incorrectly without realizing it).
Likewise, people frequently use the truth to mislead other people. True statements are themselves objective things: Like any tool, people can use them in earnest or manipulatively. There are quite a number of good quotes out there about the fact that the best lies contain at least some truth.
Which means that lies themselves have nothing to do with accuracy; rather, they have to do with the intent to mislead. On the flip side, honest statements (that is, statements made in earnest) reflect probity and sincerity, regardless of the factual accuracy of the statement.
When PolitiFact, or any fact-checking organization, analyzes a claim, it can only comment on the statement’s factual accuracy, not on whether the statement was made deceptively or sincerely. Except when a politician actually admits to lying, the intent behind a statement is nearly impossible to discern, and it is certainly not something a presumably objective media organization should attempt to judge.
Of course, part of the confusion is that PolitiFact’s own rating system uses objective measurements for all but one of its ratings: Pants on Fire. That phrase is associated with a longer phrase, which begins “Liar, Liar.” Without that rating, the rest of the scale could be viewed as a simple measure of accuracy.
But even the description of “Pants on Fire” shows that it has nothing to do with lying, per se. The description given by PolitiFact for “Pants on Fire” statements is:
The statement is not accurate and makes a ridiculous claim.
Again, we’re talking about accuracy here, not honesty. The addition of a “ridiculous” component does not automatically equate to dishonesty: Some of the most ridiculous claims I’ve heard have been made by people who were 100% sincere.
In PolitiFact’s defense, it never claims to be talking about honesty (aside from the connotation of its “Pants on Fire” rating). In fact, in response to a 2013 study by the Center for Media and Public Affairs at George Mason University, PolitiFact explicitly denied any attempt to rate the honesty of politicians.
Only Barrel suggests that PolitiFact is measuring honesty – which makes one wonder whether he himself is being dishonest, or merely ill-informed.
Okay, But What About Accuracy?
Even if PolitiFact is measuring accuracy rather than honesty, doesn’t it mean something that Democrats are accurate more often than Republicans? Even if Barrel is mistaken about what the data is measuring, the measurement itself could be interesting.
Perhaps, though that raises the question of how trustworthy PolitiFact’s data is in the first place. There are several things to consider.
First of all, it is not clear that every candidate receives an equal amount of scrutiny. As Barrel points out in his second article, PolitiFact has rated nearly 200 statements made by Trump in the last year alone, while it has rated only 225 statements by Clinton over the last nine years. For comparison, Barack Obama has had 588 statements analyzed by PolitiFact since 2007, more than twice as many as Clinton. It seems likely that if PolitiFact were scrutinizing every politician at the same rate, many more of Hillary’s statements would have been analyzed over that period.
The question then becomes one of selection. In 2009, PolitiFact Editor Bill Adair stated on C-SPAN, “We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it.” That is a curious selection methodology, choosing statements one already believes are wrong, and it could certainly introduce a level of bias.
It appears that the site’s editorial methodology has been updated somewhat since then. On the site’s Principles of PolitiFact page, posted in 2013, it lays out five criteria for choosing statements to analyze:
- Verifiability
- Misleadingness
- Significance
- Virality (i.e., likelihood of being repeated)
- Whether a typical person would wonder if a statement is true
While these criteria are somewhat more rigorous than Adair’s C-SPAN description of the process, there are still some problematic points. Interestingly, one criterion is still that the statement itself seems misleading on its face. There’s also a criterion in which they judge whether or not a “typical” person would wonder about a statement’s truth, which is a pretty vague criterion to use, all things considered.
In fact, except for the first criterion of verifiability, this set of criteria leaves open a large possibility of selection bias. If one criterion is whether a statement could be taken as misleading before one even checks the facts, then statements that seem true are never going to be analyzed. Any bias of the editors and journalists working for the site is going to show up right up front, in the types of statements they choose to examine.
Then there’s the criterion of guessing what a “typical” person might wonder is true. Considering that journalists who specialize in political fact-checking are not “typical” people (at least, with respect to political facts…), that seems a curious, and possibly undesirable, criterion to use. It would make much more sense to ask “typical” people which statements they think should be analyzed than to guess.
Potentially more problematic, though, is outright bias. Remember up above where I subtly pointed out that PolitiFact is owned by the Tampa Bay Times? Well, as it turns out, TBT has a pretty significant left-leaning bias. By its own admission in 2010, the paper has never, since its inception as a weekly paper in 1884, endorsed a Republican presidential candidate (though it has endorsed some Republicans for state offices). (Hint: It continued its streak in 2012 by endorsing Obama a second time.)
So, when Barrel points out that PolitiFact generally rates the statements of Republicans as false, while it rates the statements of Democrats as true, that has to be understood with the knowledge that:
- PolitiFact only selects statements it already thinks are likely to be false
- PolitiFact does not analyze statements from politicians at the same rate
- PolitiFact’s parent company has never endorsed a Republican presidential candidate
Taken together, these three things make me wary of treating PolitiFact’s ratings as a reliable measure of politicians’ accuracy.
How Does Barrel Rate?
Given my analysis above, I can’t help but rate Barrel’s claims on the same Truth-O-Meter scale that he has put so much trust in.
So, let’s see:
- He relies on data from a website that seems to have a pretty significant bias problem.
- He is wrong about what PolitiFact actually measures.
- He does not disclose the actual calculation of his “Honesty Index,” which he describes as a “weighted average” without explaining the weighting or even what he is averaging (the Truth-O-Meter doesn’t actually attach any numerical values to its ratings). See the sketch below for what a transparent calculation might have looked like.
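To illustrate why that omission matters, here is a minimal sketch of what a disclosed calculation could look like. Everything numeric in it is an assumption of mine, not Barrel’s method: PolitiFact attaches no numbers to its Truth-O-Meter ratings, so the scores and the example counts below are hypothetical.

```python
# Hypothetical sketch of an "Honesty Index" as a weighted average.
# The scores below are my own invention; PolitiFact's Truth-O-Meter has six
# ratings but no numbers attached to them, so any "weighted average" requires
# choices like these, and different choices produce different rankings.
RATING_SCORES = {
    "True": 1.0,
    "Mostly True": 0.8,
    "Half True": 0.5,
    "Mostly False": 0.2,
    "False": 0.0,
    "Pants on Fire": 0.0,  # or a negative penalty: another undisclosed choice
}

def honesty_index(rating_counts):
    """Weighted average of a politician's ratings under the hypothetical scores above."""
    total = sum(rating_counts.values())
    if total == 0:
        raise ValueError("No rated statements to average.")
    weighted = sum(RATING_SCORES[rating] * count for rating, count in rating_counts.items())
    return weighted / total

# Example with made-up counts (not real PolitiFact data):
example_counts = {"True": 30, "Mostly True": 40, "Half True": 50,
                  "Mostly False": 40, "False": 30, "Pants on Fire": 10}
print(round(honesty_index(example_counts), 3))  # 0.475
```

Shift those hypothetical scores even slightly (say, score “Pants on Fire” at -0.5 instead of 0) and the resulting rankings can change, which is exactly why an undisclosed weighting tells us very little.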
Given the factual inaccuracies and the wild claim about what PolitiFact measures, I’m going to have to go with: