The headlines were unsparing and unambiguous. “Anti-gay Stigma Shortens Lives,” wrote US News & World Report.
“Anti-Gay Communities Linked to Shorter Lives,” said Reuters. “LGB Individuals Living in Anti-Gay Communities Die Early,” according to Science Daily.
Two years ago, these stories were hard to ignore when Columbia professor Mark Hatzenbuehler found that gays and lesbians who faced prejudice in their communities had a life expectancy 12 years shorter than those who lived in more accepting areas. Just so we’re clear, that’s bigger than the lifespan gap between regular smokers and nonsmokers.
We always knew prejudice was bad, but an Ivy League researcher had found that there were significant effects on the physical health of those experiencing it.
But where, one might wonder, were the headlines when another researcher tried to replicate Hatzenbuehler’s effects and came up empty?
Last month, Mark Regnerus, a professor at UT Austin, published an article in the journal Social Science & Medicine that concluded that “ten different approaches to multiple imputation of missing data yielded none in which the effect of structural stigma on the mortality of sexual minorities was statistically significant.”
In other words, Regnerus tried seven — er, 10 — ways from Sunday to get the same results as Hatzenbuehler using the exact same data, and failed. Which means, he concluded, that “the original study’s . . . variable (and hence its key result) is so sensitive to subjective measurement decisions as to be rendered unreliable.”
Oops. In case you missed it, there has been a “crisis of replication” in the social sciences recently. Or at least it was discovered recently. In 2015, a large initiative called The Reproducibility Project, led by Brian Nosek at the University of Virginia, repeated 100 published psychological experiments and replicated the results of only a third of them.
Anyone familiar with the way academic publication works could have seen this coming. A former editor of the British Medical Journal explained the process of peer review, which determines whether academic research is published: “The editor looks at the title of the paper and sends it to two friends whom the editor thinks know something about the subject. If both advise publication, the editor sends it to the printers. If both advise against publication, the editor rejects the paper. If the reviewers disagree, the editor sends it to a third reviewer and does whatever he or she advises.”
Indeed, in the case of Hatzenbuehler, the author of the study was also the editor of the special issue of the journal in which the study appeared.
While no academic or media outlet has made a peep about Hatzenbuehler in the weeks since Regnerus’ article was published, Regnerus was subject to immediate public excoriation for his findings in a 2012 paper on the effects of same-sex parenting on children, which ran contrary to accepted academic opinion on the subject. Despite calls for his firing, the University of Texas found no wrongdoing. Critics disagreed with his methodology, but there was no mystery about how he arrived at his conclusions.
The difference between Regnerus and Hatzenbuehler is obvious. Certain conclusions are simply more acceptable than others.
Take, for instance, the fraudulent 2014 study in which UCLA’s Michael LaCour was found to have fabricated segments of his data on shifting attitudes toward same-sex marriage out of whole cloth. LaCour suggested that if a canvasser knocking on doors and asking people about their attitudes toward gay marriage revealed that he himself was gay, that would dramatically change the respondents’ answers.
For years, LaCour’s research was cited by major media and used in political campaigns, but it turned out to be what New York magazine’s Jesse Singal called “one of the biggest scientific frauds in recent memory.”
In the end, neither LaCour nor Hatzenbuehler actually did the work to prove their theses — because there would be no real consequences if they were caught, and anyway academia writ large didn’t want to “catch” them at all. Facts be damned. Academics care only about the “narrative.”
Naomi Schaefer Riley is a senior fellow at the Independent Women’s Forum.