Research replication in social science: reflections from Nathaniel Beck
Introduction from Michael Alvarez, co-editor of Political Analysis: Questions about data access, research transparency, and study replication have recently become heated in the social sciences. Professional societies and research journals have been scrambling to respond; for example, the American Political Science Association established the Data Access and Research Transparency committee to study these issues and to issue guidelines and recommendations for political science. At Political Analysis, the journal that I co-edit with Jonathan N. Katz, we require that all of the papers we publish provide replication data, typically before we send the paper to production. These replication materials are archived at the journal’s Dataverse, which provides permanent and easy access to them. Currently we have over 200 sets of replication materials archived there (more arriving weekly), and our Dataverse has seen more than 13,000 downloads of replication materials.
Given the interest in replication, data access, and research transparency in political science and the other social sciences, I’ve asked a number of methodologists who have been front and center on these issues to share their thoughts about what we do in political science, how well it has worked so far, and what the future might hold. I’ll also be writing more about what we have done at Political Analysis.
The first of these discussions is a set of reflections from Nathaniel Beck, Professor of Politics at NYU, whose primary interest is political methodology as applied to comparative politics and international relations. Neal is a former editor of Political Analysis, chairs our journal’s Advisory Board, and now heads the Society for Political Methodology’s own committee on data access and research transparency. His reflections offer some interesting perspectives on the importance of replication for his research and teaching, and shed light more generally on what professional societies and journals might consider for their policies on these issues.
Replication and data access have become hot topics throughout the sciences. As a former editor of Political Analysis and the chair of the Society for Political Methodology’s Data Access and Research Transparency (DA-RT) committee, I have been thinking about these issues a lot lately. But here I simply want to share a few recent experiences (two happy, one at this moment less so) that have helped shape my thinking on some of these issues. I note that in none of these cases was I concerned that the authors had done anything wrong, though of course I was concerned about the sensitivity of results to key assumptions.
The first happy experience relates to an interesting paper by Meyersson, published recently in Econometrica, on the impact of having an Islamic mayor on educational outcomes in Turkey. I first heard about the piece from some students, who wanted my opinion on the methodology. Since I am teaching a new (for me) course on causality, I wanted to dive more deeply into the regression discontinuity design (RDD) used in the article; coincidentally, a new method for doing RDD was presented at the recent (2014) meetings of the Society for Political Methodology. The less happy experience concerns the PNAS article on the Facebook experiment on social contagion. The authors, in a footnote, said that replication data was available by writing to them. I wrote twice, giving them a full month, but heard nothing. I then wrote to the editor of PNAS, who informed me that the lead author had been on vacation and was overwhelmed with responses to the article. I am promised that the check is in the mail.
What editor wants to be bothered by fielding inquiries about replication data sets? What author wants to worry about going on vacation (and forgetting to set a vacation message)? How much simpler the world would have been for the authors, the editor, and me if PNAS had simply followed the good practice of Political Analysis, the American Journal of Political Science, the Quarterly Journal of Political Science, Econometrica, and (if rumors are correct) soon the American Political Science Review, and demanded that authors post all replication materials, either on the journal web site or in the journal’s Dataverse, before an article is actually published. Why doesn’t every journal do this?
A distant second best is to require authors to post their replication materials on their personal websites. As my experience shows, this often leads to lost or non-working URLs. While the simple solution here is the Dataverse, at a minimum authors should surely provide a standard Digital Object Identifier (DOI), which should persist even as machine names change. The Dataverse solution does this, and so much more, so it seems odd in this day and age for any journal not to use it. And we can all be good citizens and put our own pre-replication-standard datasets on our own Dataverses. All of this is as easy as (and maybe easier than) maintaining private data web pages, and one can rest easy that one’s data will be available until either Harvard goes out of business or the sun burns out.
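To make the persistence point concrete, here is a minimal sketch of my own (not from Beck’s post) showing how replication materials can be fetched programmatically once they sit behind a persistent identifier: it queries the public Harvard Dataverse native API for a dataset by DOI and lists its files. The DOI shown is a placeholder, and the JSON field names reflect my reading of the current API and may vary across Dataverse versions.

import requests

BASE = "https://dataverse.harvard.edu"
DOI = "doi:10.7910/DVN/EXAMPLE"  # placeholder; substitute a real dataset DOI

# Ask the Dataverse native API for the dataset behind this persistent identifier.
resp = requests.get(
    f"{BASE}/api/datasets/:persistentId/",
    params={"persistentId": DOI},
    timeout=30,
)
resp.raise_for_status()
latest = resp.json()["data"]["latestVersion"]

# List the files; each numeric id can be downloaded via /api/access/datafile/<id>.
for entry in latest["files"]:
    datafile = entry["dataFile"]
    print(datafile["id"], datafile["filename"])

The point is not this particular API but that the DOI keeps resolving even if the hosting machine’s name changes, which a personal web page cannot promise.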
Featured image: BalticServers data center by Fleshas, CC BY-SA 3.0, via Wikimedia Commons.