Wednesday, October 22, 2008

Anti-Americanism

Unimpressed by AJS or ASR from time to time? Wonder how an article with glaring methodological errors made it to the top, or shrug helplessly when your work is misquoted in our flagship journals? Epidemiologist John Ioannidis has an explanation.

The theory of the “winner’s curse” predicts that the highest bidder in an auction has likely paid too much: the “true” value of the item being auctioned lies closer to the average of all bids than to the winning one. And so it is for journal articles, according to Ioannidis. In their eagerness to publish cutting-edge research, editors routinely sacrifice the standards of replicability necessary for robust scientific findings. Because cutting-edge fields are usually smaller than other parts of the discipline, editors often have little choice in the matter. Cutting-edge fields are also characterized by greater flexibility in research designs, definitions, outcomes, and analytical modes, with little consensus on any of these within such tiny academic tribes.
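The bidding logic above is easy to see in simulation. The sketch below (mine, not from Ioannidis) assumes a common-value auction in which each bidder bids a noisy private estimate of the item's true value; the function name and parameters are illustrative. Averaged over many auctions, the winning bid systematically overshoots the true value even though the average bid does not.

```python
import random


def simulate_winners_curse(true_value=100.0, n_bidders=20,
                           noise_sd=20.0, n_auctions=10_000, seed=42):
    """Monte Carlo sketch of the winner's curse in a common-value auction.

    Each bidder bids their own noisy estimate of the item's true value;
    the highest bid wins. Returns (average winning bid, average of all
    bids) across many simulated auctions.
    """
    rng = random.Random(seed)
    winning_bids, all_bids = [], []
    for _ in range(n_auctions):
        # Every bidder sees true_value plus independent Gaussian noise.
        bids = [true_value + rng.gauss(0, noise_sd) for _ in range(n_bidders)]
        winning_bids.append(max(bids))  # highest estimate wins
        all_bids.extend(bids)
    return (sum(winning_bids) / n_auctions,
            sum(all_bids) / len(all_bids))


avg_winner, avg_bid = simulate_winners_curse()
# avg_bid hovers near the true value of 100, while avg_winner sits
# well above it: selecting the maximum estimate selects the error.
```

The analogy to journals is that publishing only the most striking result in a small, noisy field is like awarding the auction to the most optimistic estimate.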

Analyzing a series of popular medical journal articles with more than 1,000 citations each, Ioannidis finds that most “breakthrough” theories are refuted within a few years’ time. To make matters worse, once cutting-edge theories become widely accepted, Ioannidis finds that most subsequent studies simply reflect the prevailing bias. Because flagship journals are read more frequently than others, these errors are seldom corrected within the mainstream of academic fields.

A counter-argument is that the very visibility of flagship journals creates a stronger imperative for replication and scholarly oversight. Still, if these critical errors are not corrected in high-circulation journals, the danger persists that false findings will remain seminal references in a given literature.

So does this apply to sociology? Setting aside tired questions about the balkanization of the discipline, I do not believe Ioannidis’ logic holds for our field—or at least for cultural sociology. Insofar as we are interested in uncovering the relationships between actors that shape the variables of interest for other fields, we avoid the paradox Ioannidis describes. Yet how often are “process models” replicated? Surely well-farmed network data has produced breakthrough findings that were later refuted by more sophisticated methods.

But what of our many qualitative studies? Here, we rarely even make our data publicly available. Nor do we replicate our instruments across time and space, leaving a long parade of graduate students to question the foundations of the research question instead of the findings themselves. And how can we blame them? In a market with greater returns for marketability than for meta-analysis, will cultural sociology go the way of cultural studies?

1 comment:

  1. Interesting. Although I'd like to think the Editors at AJS and ASR are more competent than those at QJE or AJPS.

    ReplyDelete