AJPS Editor’s Blog

December 1st marked our first six months as co-editors of AJPS. We want to thank our associate editors, board members, authors, and reviewers for a very smooth transition. Now that we have some experience under our belts, we thought we’d offer some thoughts on several topics that might be helpful to reviewers and authors alike. So on the occasional Tuesday, we’ll post a short entry about some aspect of the journal submission, review, or publication process that we’ve had to address over the course of the last six months. While these issues are probably relevant to most journals, we speak only for ourselves and for our expectations at AJPS.

Passing Verification with Flying Colors

For nearly five years now, AJPS has verified quantitative manuscripts. And last year, we verified our first qualitative paper as well. As most of you know, this doesn’t mean that we replicate authors’ analyses. It “simply” means that we reproduce – using the authors’ data, code, and/or interview transcripts – the results presented in the paper. Notice that simply is in quotes. That’s because most papers don’t pass verification by the Odum Institute in the first round. Sometimes the data set won’t open. Sometimes the computing clusters where the authors conducted the analyses can’t “talk” to the cluster where the analyses are being verified. Sometimes the code includes typos or other minor errors. Sometimes it’s all of the above. Regardless of the problem, though, when a paper can’t be verified, it goes back to the author for revision and resubmission. Multiple rounds of back and forth are, unfortunately, not uncommon.

Given that authors now conduct increasingly complex analyses that regularly rely on multiple data sets, we figured that this would be a good time to offer three helpful hints that can make the verification process less cumbersome and more efficient:

  1. Verify your own analyses first. Before uploading your data set and code to Dataverse, run the code from start to finish and see whether you get the exact same results that you report in the paper. Sounds obvious, right? Well, it turns out that most authors actually don’t do this. But those who do have reported a relatively seamless verification process, because they catch the glitches, typos, and bugs before Odum does. If you can perform this task on a different computer from the one you typically use, even better. That increases the probability that the code will run when Odum begins the verification. (A minimal sketch of this kind of self-check appears right after this list.)
  2. Keep the code as slim as possible. After a paper has been verified, any change that touches the verified results means the whole thing must go through the process again. Let’s say you want to modify a figure’s title and source note when you receive the page proofs. Well, if you generated the title and note in Stata or R and they’re part of the code, then the whole paper needs to go back to Odum. To avoid unnecessary delays, try to remove from the code anything that isn’t needed to produce the actual results or the figure itself. If you can do it in Word, take it out of the code.
  3. Talk to the IT person at your institution. If you know that your analyses have caused you any trouble or required somewhat unusual accommodations – like a supercomputer or major technical assistance merging data sets – then prepare a document that summarizes how you solved these problems and the specifications of the computing environment you used. That short memo will go a long way toward helping Odum determine the best way to verify the analyses without having to troubleshoot all of the issues you already confronted.
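
To make the first and third suggestions concrete, here is a minimal sketch of what a self-verification run might look like in R (one of the programs mentioned in point 2); the same idea carries over to Stata or any other environment. The file names ("analysis.R", "table2_coefs.rds") and the object being checked are hypothetical stand-ins for your own replication materials.

    # Point 1: re-run the full analysis from a clean workspace and confirm that it
    # reproduces the numbers reported in the paper. All names below are placeholders.
    rm(list = ls())                 # start from a clean slate, as the verifiers will
    source("analysis.R")            # assumed to re-create an object `table2_coefs`
    published <- readRDS("table2_coefs.rds")   # the estimates as reported in the paper

    # Stop with an error if the regenerated results differ beyond rounding error
    stopifnot(isTRUE(all.equal(table2_coefs, published, tolerance = 1e-8)))

    # Point 3: record the computing environment (R version, platform, packages),
    # which you can paste into the memo that accompanies your replication files
    sessionInfo()

Running a script like this on a computer other than the one where you did the original work (a colleague’s laptop, say) is the closest approximation to what Odum will do, and it tends to surface things like hard-coded file paths before they turn into another round of revisions.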

We know that the verification process is clunky. But taking these three steps will undoubtedly make it a lot smoother and much faster. It’ll also make it that much easier when you want to verify someone else’s results.


