PolitiFact has assigned "Pants on Fire" or "False" ratings to 39 percent of the Republican statements it has graded since January 2010, compared to just 12 percent of Democratic statements.
PolitiFact, the high-profile political fact-checking operation at the St. Petersburg Times, has at times been criticized by those on the right for alleged bias in its grading of statements made by political figures and organizations.
The organization (and now its more than half a dozen state offshoots) grades statements made by politicians, pundits, reporters, interest groups, and even the occasional comedian (anyone "driving the political discourse") on a six-point "Truth-O-Meter" scale: True, Mostly True, Half True, Barely True, False, and Pants on Fire for "ridiculously" false claims.
But although PolitiFact publishes a blueprint for how statements are rated, it does not detail how statements are selected.
There is no doubt that members of both political parties make numerous accurate statements, inaccurate ones, and everything in between; the fundamental question is which statements (and which politicians) are targeted for analysis in the first place.
A Smart Politics content analysis of more than 500 PolitiFact stories from January 2010 through January 2011 finds that current and former Republican officeholders have been assigned substantially harsher grades by the news organization than their Democratic counterparts.
In total, 74 of the 98 statements by political figures judged "false" or "pants on fire" over the last 13 months were given to Republicans, or 76 percent, compared to just 22 statements for Democrats (22 percent).
First, it should be acknowledged that the number of public officials subjected to PolitiFact's Truth-O-Meter lens from each party is fairly even during the period under analysis.
Of the 511 statements put through the Truth-O-Meter test from January 1, 2010 through January 31, 2011, PolitiFact devoted 74 percent of its attention to current and former political officeholders and elected officials (379 statements), 17 percent to ideological organizations and individuals not holding political office (85 statements), and 9 percent to other groups and individuals without a partisan or ideological agenda (28 statements). Another 20 statements came from chain e-mails, public opinion polls, bumper stickers, or "bloggers" generally (4 percent).
For those current or former political officeholders, PolitiFact has devoted roughly equal attention to Republicans (191 statements, 50.4 percent) and Democrats (179 statements, 47.2 percent), with a handful of statements from independents (9 statements, 2.4 percent).
Assuming for the purposes of this report that the grades assigned by PolitiFact are fair (though some would challenge this assumption), there has nonetheless been a great discrepancy between which party's officials and officeholders receive the top ratings and which are accused of not telling the truth.
Republican statements were graded in the dreaded "false" and "pants on fire" categories 39 percent of the time, compared to just 12 percent for statements made by Democrats.
Republicans were also assigned a larger percentage of "Barely True" statements than Democrats, bringing the tally of falsehoods or near falsehoods in the bottom three categories to 52.9 percent of Republican statements, compared to just 24.6 percent of Democratic statements.
That means a supermajority of falsehoods documented by PolitiFact over the last year - 76 percent - were attributed to Republicans, with just 22 percent of such statements coming from Democrats.
As a consequence, Democrats have been presented as much more truthful, with over 75 percent of their statements receiving one of the top three grades: True (16 percent), Mostly True (27 percent), or Half True (33 percent).
Fewer than half of the Republican statements graded by PolitiFact were rated half true or better: just 90 of 191 (47 percent).
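The percentages above follow directly from the raw counts reported in this analysis. A minimal arithmetic check, using only the figures cited in the article (191 Republican and 179 Democratic statements; 74 and 22 of them, respectively, rated "False" or "Pants on Fire" out of 98 such ratings overall):

```python
# Counts reported in the article
rep_total, dem_total = 191, 179      # statements graded per party
rep_false, dem_false = 74, 22        # "False" + "Pants on Fire" ratings
all_false = 98                       # total false ratings across parties

# Share of each party's own statements rated false
rep_false_share = round(100 * rep_false / rep_total)   # 39 percent
dem_false_share = round(100 * dem_false / dem_total)   # 12 percent

# Share of all false ratings attributed to each party
rep_of_falsehoods = round(100 * rep_false / all_false)  # 76 percent
dem_of_falsehoods = round(100 * dem_false / all_false)  # 22 percent

print(rep_false_share, dem_false_share)       # 39 12
print(rep_of_falsehoods, dem_of_falsehoods)   # 76 22
```

The rounded figures match the percentages quoted throughout the piece.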
PolitiFact Ratings of Current and Former Political Officials, January 2010 - January 2011
Whereas Boehner received six "True," two "Mostly True," and one "Half True" rating during this span, Pence and the NRCC received none in these categories, Bachmann only two, and Palin just four.
What is particularly interesting about these findings is that the political party in control of the Presidency, the US Senate, and the US House during almost the entirety of the period under analysis was the Democrats, not the Republicans.
And yet, PolitiFact chose to highlight untrue statements made by those in the party out of power.
But this potential selection bias - if there is one at PolitiFact - seems to be aimed more at Republican officeholders than conservatives per se.
In an examination of the more than 80 statements PolitiFact graded over the past 13 months from ideological groups and individuals who have not held elective office, conservatives received only slightly harsher ratings than liberals.
These findings raise the central unanswered question: what is the process by which PolitiFact selects the statements it ultimately grades?
When PolitiFact editor Bill Adair appeared on C-SPAN's Washington Journal in August 2009, he explained how statements are picked:
"We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it."
If that is the methodology, then why is it that PolitiFact takes Republicans to the woodshed much more frequently than Democrats?
One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site.
However, PolitiFact offers no evidence that this is its calculus in deciding which statements to rate.
Nor does PolitiFact claim on its site to present a 'fair and balanced' selection of statements, or that the statements rated are representative of the general truthfulness of the nation's political parties or the elected officials involved.
In defending PolitiFact's "statements by ruling" summaries - tables that combine all ratings given by PolitiFact to an individual or group - Adair explained:
"The media in general has shied away from fact checking to a large extent because of fears that we'd be called biased, and also because I think it's hard journalism. It's a lot easier to give the on-the-one-hand, on-the-other-hand kind of journalism and leave it to readers to sort it out. But that isn't good enough these days. The information age has made things so chaotic, I think it's our obligation in the mainstream media to help people sort out what's true and what's not."
The question is not whether PolitiFact will ultimately convince skeptics on the right that it has no ulterior motives in selecting which statements to rate, but whether the organization can make a convincing argument that either a) Republicans do in fact lie much more than Democrats, or b) if they do not, it is immaterial that PolitiFact covers political discourse with a frame suggesting this is the case.
In his August 2009 C-SPAN interview, Adair explained how the Pants on Fire rating was the site's most popular feature, and the rationale for its inclusion on the Truth-O-Meter scale:
"We don't take this stuff too seriously. It's politics, but it's a sport too."
By levying 23 Pants on Fire ratings on Republicans over the past year compared to just 4 on Democrats, it appears the sport of choice is game hunting - and the game is elephants.