The New AJPS Editorial Team Starts Today! Here Are Our Four Central Goals

By AJPS Co-Editors Kathy Dolan and Jennifer Lawless

Today is the day! The new editorial team at AJPS is up and running. We’re honored to serve the discipline this way, and we’re excited about what the next four years have in store. Before anything else, we want to introduce the new team:

Co-Editors-in-Chief:

Kathleen Dolan, University of Wisconsin-Milwaukee
Jennifer Lawless, University of Virginia

Associate Editors:
Elizabeth Cohen, Syracuse University
Rose McDermott, Brown University
Graeme Robertson, University of North Carolina
Jonathan Woon, University of Pittsburgh

You can take a look at the new Editorial Board here. We are thrilled that such an impressive, well-rounded, diverse group of scholars agreed to serve.

Over the course of the coming days and weeks, we’ll use this blog to call your attention to new policies and procedures. (Don’t worry – for the most part, processes won’t change!) But we want to take a few minutes now to highlight four central goals for our term.

STABILITY: AJPS has undergone a lot of transitions in a short period of time. And we’re grateful to the interim team for stepping up on short notice last year and working tirelessly to ensure that the journal would continue to thrive. But now we’ve got a permanent team in place for the next four years and are eager to provide the stability the journal needs.

TRANSPARENCY: We’re committed to managing a process that maintains transparency and academic rigor. We will accomplish this, in part, by maintaining the current system of data verification and the professional and personal conflict of interest policy. We will also require authors of work based on human subjects to confirm institutional IRB approval of their projects at the time a manuscript is submitted for consideration. And we’ll be vigilant about ensuring that authors are authorized to use – at the time of submission – all data included in their manuscripts.

DIVERSITY: As scholars of gender politics, we are well aware of the ways in which top journals do not always represent the diversity of the discipline. In putting together our team of Associate Editors and our Editorial Board, we have intentionally worked to represent diversity of race, sex, subfield, rank, institution, and methodology. It is our hope that the presence and work of these leaders send a message to the discipline that we value all work and the work of all. We want to be as clear as possible, though, that our plan to diversify the works and the scholars represented in the journal in no way compromises our commitment to identifying and publishing the best political science research. Indeed, we believe that attempts at diversification will actually increase the odds of identifying the best and most creative work.

OPEN COMMUNICATION: The journal’s success is contingent on the editorial team, authors, reviewers, and the user-community working together. In that vein, we value open communication. Undoubtedly, you won’t love everything we do. Maybe you’ll be upset, disappointed, or troubled by a decision we make. Perhaps you’ll disagree with a new policy or procedure. Please contact us and let us know. We can likely address any concerns better through direct communication than by discerning what you mean in an angry tweet. We get that those tweets will still happen. But we hope you’ll feel comfortable contacting us directly before your blood begins to boil.

Before we sign off, we want to let you know that we’re aware that, for some people, earlier frustration with the MPSA had bled over into AJPS. We ask for your continued support and patience as the new MPSA leadership addresses issues of concern and seeks to rebuild your trust. We ask that you not take your frustrations out on the journal by refusing to submit or review. A journal can only function if the community is invested in it.

Thanks in advance for tolerating the transition bumps and bruises that are sure to occur. We’ll try to minimize them; we promise.

Kathy and Jen

Verification, Verification

By Jan Leighley, AJPS Interim Lead Editor  

After nine months of referring to the AJPS “replication policy,” or (in writing) “replication/verification” policy, I finally had to admit it was time for a change. As lead editor, I had been invited to various panels and workshops where I noticed that the terms “replication,” “verification,” and “reproducibility” were often used interchangeably (sometimes less awkwardly than others), and others where there were intense discussions about what each term meant or required.

Spoiler Alert: I have no intention, in the context of this post, with 10 days left in the editorial term, to even begin to clarify the distinctions between reproducibility, replicability, and verifiability—and how these terms apply to data and materials, in both qualitative and quantitative methods.

A bit of digging in the (surprisingly shallow) archives suggested that “replication” and “verification” had often been used interchangeably (if not redundantly) at AJPS. Not surprising, given the diversity of approaches and terminology used in the natural and social sciences more broadly (see “Terminologies for Reproducible Research” at arXiv.org). But in a 2017 Inside Higher Ed article, “Should Journals Be Responsible for Reproducibility?”, former editor Bill Jacoby mentioned that the AJPS “Replication and Verification Policy” terminology would soon be adjusted to be consistent with that of the National Science Foundation. From the article: “Replication is using the same processes and methodology with new data to produce similar results, while reproducibility is using the same processes and methodology on the same dataset to produce identical results.”

It made sense to me that a change in names had been in the making, in part due to the important role of the AJPS as a leader in the discipline, the social sciences, and possibly the natural sciences on issues of transparency and reproducibility in scientific research. While I had no plans as interim editor to address this issue, the publication of the journal’s first paper relying on (verified) qualitative research methods required that the editorial team review the policy and its procedures. That review led to a consideration of the similarities and differences in verifying quantitative and qualitative papers for publication in the AJPS—and my decision to finally make the name change “legal” after all this time: the “AJPS Replication & Verification Policy” that we all know and love will now move forward officially as the “AJPS Verification Policy.”

This name change reflects my observation that what we are doing at AJPS currently is verifying what is reported in the papers that we publish, though what we verify differs for qualitative and quantitative approaches. In neither case do we replicate the research of our authors.

Do note that the goals and procedures that we have used to verify the papers we publish will essentially remain the same, subject only to the routine types of changes made as we learn how to improve the process, or to the kinds of adjustments that come with changes of editorial teams. Since the policy was announced in March 2015, the Odum Institute has used the data and materials posted on the AJPS Dataverse to verify the analyses of 195 papers relying on quantitative analyses.

Our experience in verifying qualitative analyses, in contrast, is limited at this point to only one paper, one that the Qualitative Data Repository verified early this spring, although several others are currently under review. As in the case of quantitative papers, the basic procedures and guidelines for verification of qualitative papers have been posted online for several years. We will continue to develop appropriate verification procedures, as we build on our limited experience thus far, and respond to the complexity and heterogeneity of qualitative research methods. Authors of accepted papers (or those who are curious about verification procedures) should check out the guidelines and checklists posted at www.ajps.org to learn more.

For those who care about graphics more than terminology (!), I note that a few changes have been made to the badges awarded to verified articles. I’ve never been a badge person myself, but apparently this is the currency of the realm in open science circles, and some research suggests that awarding these badges makes researchers more likely to follow “open science” practices in their work. AJPS is proud to have our authors’ papers sport these symbols of high standards of transparency in the research process, both on our Dataverse page and on our published papers. Our badge updates include the addition of the words “peer review” to reflect that our verification policy relies on external reviewers (i.e., Odum and QDR) to document verifiability rather than doing it in-house, which is the most distinctive aspect of the AJPS Verification Policy. They also include a new “Protected Access” badge that will signify the verification of data that are available only through application to a protected repository, as identified by the Center for Open Science. As new papers are accepted for publication, you will begin to see more of the new badges, along with revised language that reflects more precisely what those badges represent.

Cheers to replication, verification—and the end of the editorial term!
Jan (Sarah, Mary, Jen, Layna and Rocio)


Citation:
Jacoby, William G., Sophia Lafferty-Hess, and Thu-Mai Christian. 2017. “Should Journals Be Responsible for Reproducibility?” Inside Higher Ed [blog], July 17.

Our Experience with the AJPS Transparency and Verification Process for Qualitative Research

“As the editorial term ends, I’m both looking back and looking forward . . . so, as promised, here’s a post by Allison Carnegie and Austin Carson describing their recent experience with qualitative verification at AJPS . . . and within the next week I’ll be posting an important update to the AJPS “Replication/Verification Policy,” one that will endure past the end of the term on June 1.”
– Jan Leighley, AJPS Interim Editor


By Allison Carnegie of Columbia University and Austin Carson of the University of Chicago

The need for increased transparency for qualitative data has been recognized by political scientists for some time, sparking a lively debate about different ways to accomplish this goal (e.g., Elman, Kapiszewski and Lupia 2018; Moravcsik 2014). As a result of the Data Access and Research Transparency (DA-RT) initiative and the final report of the Qualitative Transparency Deliberations, many leading journals, including the AJPS, adopted such policies. (Follow this link for a critical view of DA-RT.) While the AJPS has had such a policy in place since 2016, ours was the first article to undergo the formal qualitative verification process. We had a very positive experience with this procedure, and we want to share how it worked with other scholars who may be considering using qualitative methods as well.

In our paper, “The Disclosure Dilemma: Nuclear Intelligence and International Organizations” (https://doi.org/10.1111/ajps.12426), we argue that states often wish to disclose intelligence about other states’ violations of international rules and laws, but are deterred by concerns about revealing the sources and methods used to collect it. However, we theorize that properly equipped international organizations can mitigate these dilemmas by analyzing and acting on sensitive information while protecting it from wide dissemination. We focus on the case of nuclear proliferation and the IAEA in particular. To evaluate our claims, we couple a formal model with a qualitative analysis of each case of nuclear proliferation, finding that strengthening the IAEA’s intelligence protection capabilities led to greater intelligence sharing and fewer suspected nuclear facilities. This analysis required a variety of qualitative materials, including archival documents, expert interviews, and other primary and secondary sources.

To facilitate the verification of the claims we made using these qualitative methods, we first gathered the raw archival material that we used, along with the relevant excerpts from our interviews, and posted them to a Dataverse location. The AJPS next sent our materials to the Qualitative Data Repository (QDR) at Syracuse University, which reviewed our Readme file, verified the frequency counts in our tables, and reviewed each of our evidence-based arguments related to our theory’s mechanisms (though it did not review the cases in our Supplemental Appendix). (More details on this process can be found in the AJPS Verification and Replication policy, along with its Qualitative Checklist.) QDR then generated a report that identified statements it deemed “supported,” “partially supported,” or “not documented/referenced.” For the third type of statement, we were asked to do one of the following: provide a different source, revise the statement, or clarify whether we felt that QDR misunderstood our claim. We were free to address the other two types of statements as we saw fit. While some have questioned the feasibility of this process, in our case it took roughly the same amount of time that verification of quantitative data typically does, so it did not delay the publication of our article.

We found the report to be thorough, accurate, and helpful. While we had endeavored to support our claims fully in the original manuscript, we fell short of this goal on several counts, and followed each of QDR’s excellent recommendations. Occasionally, this involved a bit more research, but typically it resulted in our clarifying statements, adding details, or otherwise improving our descriptions of, say, our coding decisions. For example, QDR noted instances in which we made a compound claim but the referenced source supported only one of the claims. In such a case, we added a citation for the other claim as well. We then drafted a memo detailing each change that we made, which QDR reviewed and responded to within a few days.

Overall, we were very pleased with this process. This was in no small part due to the AJPS editorial team, whose patience and guidance in shepherding us through this procedure were greatly appreciated. As a result, we believe that the verification both improved the quality of evidence and better aligned our claims with our evidence. Moreover, it increased our confidence that we had clearly and accurately communicated with readers. Finally, archiving our data will allow other scholars to access our sources and evaluate our claims for themselves, as well as potentially use these materials for future research. We thus came away with the view that qualitative transparency is achievable in a way that is friendly to researchers and can improve the quality of the work.

About the Authors: Allison Carnegie is Assistant Professor at Columbia University and Austin Carson is Assistant Professor at the University of Chicago. Their research, “The Disclosure Dilemma: Nuclear Intelligence and International Organizations” (https://doi.org/10.1111/ajps.12426), is now available in Early View and will appear in a forthcoming issue of the American Journal of Political Science. Carnegie can be found on Twitter at @alliecarnegie and Carson at @carsonaust.

References

Elman, Colin, Diana Kapiszewski, and Arthur Lupia. 2018. “Transparent Social Inquiry: Implications for Political Science.” Annual Review of Political Science 21: 29–47.

Moravcsik, Andrew. 2014. “Transparency: The Revolution in Qualitative Research.” PS: Political Science & Politics 47(1): 48–53.

On Manuscript Preparation, Salami-Slicing, and Professional Standards

By Jan Leighley, AJPS Interim Lead Editor  

One of the most challenging (and potentially mind-numbing) tasks that occurs in the inner sanctum of the editorial office is the veritable “technical check.” Even mentioning this work might trigger some unpleasant memories for colleagues who previously served as graduate assistants for AJPS editors over the past several decades. It might also remind those who recently submitted manuscripts of the long checklist of required “to-do’s” that, if not met, delays the long-anticipated start of the peer review process.

But the requirements of manuscript preparation focusing on the mechanics (e.g., double-spacing, complete citations, word limits, etc.) are only part of what editors and reviewers depend on authors for. Beyond the detailed items that staff can verify, editors expect that authors follow our “Guidelines for Preparing Manuscripts,” including not submitting manuscripts that are under review elsewhere, that include material already published elsewhere, or that have previously been reviewed at the AJPS. Before submitting your next paper, take a fresh look at the long list of expectations for manuscript preparation and manuscript submissions at www.ajps.org, as that list of requirements seems to grow ever longer with every editorial term—and the new editorial team will likely update that list as they see fit.

One of the submission requirements that we added a few months ago is: If the paper to be submitted is part of a larger research agenda (e.g., other related papers under review or book manuscripts in development), these details should be identified in the “Author Comments” text box during the manuscript submission process. We added this requirement after several reviewers, on different manuscripts, questioned the original contribution of the papers they were reviewing, as the papers seemed trivially different from other papers associated with a bigger project. Editors (thank you, John Ishiyama) sometimes refer to this as “salami slicing,” with the question being: how thin a slice of the big project can stand as its own independent, substantial contribution? Another reason for asking authors to report on bigger, related projects is that these projects, if they involve a large group of scholars in a subfield who are not authors on the submitted paper, might compromise the peer review process. Providing these details, as well as a comprehensive list of the co-authors of every author of the manuscript being submitted, is incredibly helpful as editors seek to identify appropriate reviewers—including those who might have conflicts of interest with the authors, or those who may base their review on who the author is, rather than the quality of the work.

As a testament to the serious and careful work our reviewers do, over the past few months, we have had to respond to problems with a number of submitted manuscripts that reviewers have suggested violate AJPS’s peer review principles. One reviewer identified a paper that had previously been declined, as he or she had already reviewed it once. Some, but not all, authors have communicated directly with us, asking whether, with substantial revisions to theory, data, and presentation, we would allow a (previously declined) paper to be reviewed as a new manuscript submission. Usually these revised manuscripts do not clear the bar as new submissions. In some senses, if you have to ask, you probably are not going to clear that bar. But we applaud these authors for taking this issue seriously, and communicating with us directly. That is the appropriate, and ethical, way to handle the question.

We’ve had similar problems with manuscripts that include text that has been previously published in another (often specialized subfield or non-political science) journal. Reasonable people, I suppose, might disagree about the “seriousness” or ethics of using paragraphs that have been published elsewhere in a paper under review at AJPS (or elsewhere). The usual response is: How many ways are there to describe a variable, or a data set, or a frequency distribution? To avoid violating the “letter of the law,” authors sometimes revert to undergraduate approaches to avoiding plagiarism, changing a word here or there, or substituting different adjectives in every other sentence. The more such paragraphs there are, of course, the more squarely the issues of “text recycling” and “self-plagiarism” come into play.

This sloppiness or laziness, however, pales in comparison to the more egregious violations of shared text between submitted and previously published papers that we have had to deal with. Sometimes we have read the same causal story, or seen analytical approaches augmented with one more variable added to a model, or a different measure used to test a series of the same hypotheses, or three more countries or ten more years added to the data set. At that point, we had to determine whether the manuscript violated journal policies or professional publishing standards.

When faced with these issues, we have followed the recommendations of the Committee on Publication Ethics and directly contacted authors for responses to the issues we raise. I realize that junior faculty (especially) are under incredible pressure to produce more and better research in a limited pre-tenure period; and I recognize that (a handful of?) more senior faculty may have some incentives for padding the c.v. with additional publications for very different reasons.

While there might be grey areas, I admit to having little sympathy for authors “forgetting” to cite their own work, using “author anonymity” as an excuse for not citing relevant work, or cutting and pasting text from one paper to another. This is not to say that the issues are simple, or that the appropriate editorial response is obvious. But it is discouraging to have to spend editorial time on issues such as these. And as a discipline, we can do better, by explicitly teaching our students, and holding colleagues accountable to, principles of openness, honesty, and integrity. Read the guidelines. Do the work. Write well. Identify issues before you submit. And don’t try to slide by.

The discipline—its scholarship, its publishing outlets, its editorial operations, and its professional standards—has certainly changed a lot, and in many good ways, since the last time I edited. What has not changed is the critical importance of expecting our students and colleagues to respect shared ethical principles. Our editorial team has made some of those issues more explicit in the submission process, asking about editorial conflicts of interest, IRB approvals, and potential reviewer conflicts of interest. While this requires more work of our authors, we think it is work that is well worth the effort, and we thank our authors and reviewers for helping us maintain the highest of professional standards at the AJPS.

Celebrating Verification, Replication, and Qualitative Research Methods at the AJPS

By Jan Leighley, AJPS Interim Lead Editor

I’ve always recommended to junior faculty that they celebrate each step along the way toward publication: Data collection and analysis—done! Rough draft—done! Final draft—done! Paper submitted for review—done! Revisions in response to first rejection—done! Paper submitted for review a second time—done! In that spirit, I’d like to celebrate one of AJPS’s “firsts” today: the first verification, replication, and publication of a paper using qualitative research methods, “The Disclosure Dilemma: Nuclear Intelligence and International Organizations (https://doi.org/10.1111/ajps.12426)” by Allison Carnegie and Austin Carson.


As with many academic accomplishments, it takes a village—or at least a notable gaggle—to make good things happen. The distant origins of the AJPS replication/verification policy lie in Gary King’s 1995 “Replication, Replication” essay, as well as in the vigorous efforts of Colin Elman, Diana Kapiszewski, and Skip Lupia as part of the DA-RT initiative that began around 2010 (for more details, including others who were involved in these discussions, see https://www.dartstatement.org/events), and in the work of many others in between, especially the editors of the Quarterly Journal of Political Science and Political Analysis. At some point, these journals (and perhaps others?) expected authors to post replication files, but where the files were posted, and whether publication was contingent on posting such files, varied. They also continued the replication discussion that King’s (1995) essay began, as a broader group of political scientists (and editors) started to take notice (Elman, Kapiszewski and Lupia 2018).

In 2012, AJPS editor Rick Wilson required that replication files for all accepted papers be posted to the AJPS Dataverse. Then, in 2015, AJPS editor Bill Jacoby announced the new policy that all papers published in AJPS must first be verified prior to publication. He initially worked most closely with the late Tom Carsey (University of North Carolina; Odum Institute) to develop procedures for external replication of quantitative data analyses. Upon satisfaction of the replication requirement, the published article and associated AJPS Dataverse files are awarded “Open Practices” badges as established by the Center for Open Science. Since then, the staff of the Odum Institute and our authors have worked diligently to ensure that each paper meets the highest of research standards; as of last week, we had awarded replication badges to 185 AJPS publications.

In 2016, Jacoby worked with Colin Elman (Syracuse University) and Diana Kapiszewski (Georgetown University), co-directors of the Qualitative Data Repository at Syracuse University, to develop more detailed verification guidelines appropriate for qualitative and multi-method research. This revision of the original verification guidelines acknowledges the diversity of qualitative research traditions and clarifies differences in the verification process necessitated by the distinct features of quantitative and qualitative analyses, and by different types of qualitative work. The policy also discusses confidentiality and human subjects protection in greater detail for both types of analysis.

But it is only in our next issue that we will publish our first paper (available online today in Early View with free access) that required verification of qualitative data analysis, “The Disclosure Dilemma: Nuclear Intelligence and International Organizations” (https://doi.org/10.1111/ajps.12426) by Allison Carnegie and Austin Carson. I’m excited to see the AJPS move the discipline along in this important way! To celebrate our first verification of qualitative work, I’ve asked Allison and Austin to share a summary of their experience, which will be posted here in the next few weeks.

Thanks to the efforts of those named here (and those I’ve missed, with apologies), today the AJPS is well-known in academic publishing circles for taking the lead on replication/verification policies—so much so that in May, Sarah Brooks and I will be representing the AJPS at a roundtable on verification/replication policies at the annual meeting of the Council of Science Editors (CSE), an association of journal editors from the natural and medical sciences. AJPS will be the one and only social science journal represented at the meeting, where we will discuss what we have learned, and how better to support authors in this process.

If you have experiences you wish to share about the establishment of the replication/verification policy, or questions you wish to raise, feel free to send them to us at ajps@mpsanet.org. And be sure to celebrate another first!

Cited in post:

King, Gary. 1995. “Replication, Replication.” PS: Political Science & Politics 28(3): 444–452. https://doi.org/10.2307/420301

Elman, Colin, Diana Kapiszewski, and Arthur Lupia. 2018. “Transparent Social Inquiry: Implications for Political Science.” Annual Review of Political Science 21: 29–47. https://doi.org/10.1146/annurev-polisci-091515-025429

Peer Review Week at AJPS: Better Late than Never (Just like Reviews!)

By the AJPS Editorial Team

Who knew that September 10-15 was “Peer Review Week,” a celebration held in honor of the essential role of peer review in the academy and scholarly publications? We didn’t, until the week-long celebration was nearly over. Nonetheless, we made it a memorable week on the AJPS reviewer front by updating the “reviewer guidelines” at www.ajps.org. Please take a quick look at those details before submitting your next review.

Here, we offer some additional thoughts about reviewing. Doing reviews often feels like a thankless task, one that takes time away from the more pressing matters in academic life. But we are keenly aware that, without able and willing reviewers, the entire peer-review enterprise would collapse. While some of these comments reflect our own take on reviews and reviewing, we suspect they are shared by other editors in the discipline.

Why should I review? Participating in the peer review of research articles contributes to your scholarly community. It is a way to keep abreast of new ideas and new approaches. It offers an opportunity to use the expertise you’ve developed, via years of study and authoring papers, to advance our collective knowledge. And if the collective, altruistic view of reviewing doesn’t strike a chord, then how about this: it’s a great way to learn something—about substance, method or writing, to name a few.

For whom should I review? Many of us receive more review requests than we could ever accept, sometimes from journals that seem a bit outside of our usual purview. Review for the journals where you want to publish, and for the journals you regularly read. Reviewing for and publishing in more general journals allows you to have a voice in the leading scholarship in your field. And whatever the impact factor, reviewing for more specialized journals usually shifts the focus to more in-depth material and may benefit your own work even more.

How many reviews should I do? People sometimes don’t have a sense of how often they should review. Editorial Board members are often expected to review more than others; we try not to ask others for reviews more than a couple of times a year. But it varies a lot, both here at AJPS and at other journals, and we as a discipline don’t coordinate on counting up the number of reviews across journals (although Publons may help a tiny bit with this). Data on the peer review process suggest that the burden of reviewing is unevenly distributed. In fact, one reason reviewers decline our invitations is that our “ask” has arrived when the potential reviewer has already committed to reviewing multiple papers, and they can’t add ours to the list.

Some editors like to remind their colleagues that, if they are getting three reviews every time they submit a piece, then they ought to be doing three reviews for every submission or re-submission they make in a given year. At AJPS, we now ask anyone who submits an article to commit to doing two reviews during the coming year. We appreciate that review requests sometimes come at inopportune times (whether or not you have other review commitments in line ahead of AJPS). In other cases, potential reviewers do not feel as expert as they might like. We (and most editors) appreciate why reviewers may need to decline in these situations. But every editor loves it when a declined reviewer invitation is accompanied by suggestions of potential alternate reviewers. Our reviewer database is a work in progress, and we certainly do not know people in every nook and cranny of the profession. Adding new reviewers to our database is critical to the editorial process, so making reviewer recommendations is incredibly helpful. You might also think of making recommendations as an opportunity to share the love with your friends, colleagues, and co-authors!

Why was I invited to review? The peer review process is premised on expertise: you are invited to review because the editors believe you have the expertise to evaluate the argument, claims, and methods used in the paper. That said, you may not be an expert on every aspect, and you do not need to address each of them. You can focus your comments on the issues you believe to be most important as a basis for making a recommendation. Often editors will invite reviewers who have published research on similar—but not exactly the same—questions to provide some diversity across reviewers. This is especially true at less specialized journals, where editors might want to know what scholars in the same subfield—but with different research expertise—think about the paper’s argument, importance, or potential (broader) impact.

What if I know who the author is? Convey this information to the editorial office immediately. It is helpful to explain for how long, and in what capacity, you have known the author and how you know the paper. Some editors will immediately release you from the review invitation (and appreciate being able to do this); others will judge whether the circumstances are sufficiently compromising that they need to find another reviewer. Or the editors may ask the invited reviewer to complete the review if they feel that they can offer an unbiased, serious review independent of knowing the author’s identity.

What if I’ve reviewed the manuscript for another journal? Convey this information to the editorial office as well. Many editors will leave the decision up to the reviewer; some will release you from the review. The basic principle here is that the reviewer and editor should be in agreement as to whether doing the review is the right thing.

What should I include in the review? Many journals will describe what they want in a review when they invite you, or post these details on the journal’s website. Oftentimes this comes in a series of questions that might structure the review: What is the paper’s theoretical motivation? Its theoretical or empirical contribution? Is the method appropriate to the question? How persuasive is the empirical evidence? Is the author’s interpretation of the evidence accurate? Is it appropriate? What are the paper’s strengths and weaknesses? Can the weaknesses be addressed? While providing a summary of the paper (from a few sentences to a lengthier paragraph) can help set the stage for the review, the more important details are your answers to the questions posed by the editors.

In addition, if you are really impressed by a paper, please tell us why. Sometimes reviewers are quicker to criticize than to praise. While we understand (and often share) that urge, keep in mind that we are seeking reasons to accept or advance a piece, as well as reasons to decline it. If you love a manuscript, don’t be afraid to advocate for it. You may find this piece, written by Sara Mitchell (an AJPS Editorial Board member), to be useful.

Will the editor follow my recommendation? Sometimes they will; sometimes they won’t. At most journals, reviews are advisory to the editor. And it is important to remember that, regardless of the time invested in the review, the editor has more information than any single reviewer has. She knows who the members of the review panel are and what their review histories are (some reviewers inflate their grades, whereas others do not!). She also often receives private “to the editor” comments from reviewers, which sometimes emphasize certain points made in the review. Moreover, the editor sees reviews for dozens or even hundreds of manuscripts per year. While each manuscript decision is dependent on the reviews that are submitted, the editors have the experience to assess what good and bad reviews look like, and what a reasonable amount of work for a revision would be.

With all that said… We now declare this week as Peer Review Week at AJPS! So send in those reviews (early, late or on deadline). And take a few minutes to update your contact details and research interests in our Editorial Manager database, so that we know how (and for what) to invite you to review. If you do not have an existing Editorial Manager profile and want to review, begin one—and send along your C.V. to the editorial office at ajps@mpsanet.org so we can learn more.

ICYMI: New People, New Policies

By Jan Leighley, AJPS Interim Lead Editor

Summer breaks are never long enough, eh? It’s back to “normal” business as of earlier today, when the Editorial Manager portal re-opened for new submissions. During our break, I posted some details about the updated submission guidelines, so be sure to take a careful look at some of the changes that we have made (more on those below).

The other change that we made over break was shifting the editorial staff responsibilities from Michigan State University to American University. The shift in staff responsibilities reflects a fundamental change in the editorial structure—with associate editors identifying reviewers and drafting decision letters based on their readings of the manuscript and the reviews, and with me sending out those letters after reviewing the manuscript and its reviews myself. One of the biggest changes has been having Marty Jordan, previously the Managing Editor, shift to a new role as Production Editor. In this capacity, Marty will deal with all matters relating to production as well as replication. At AU, Julia Salvatore has taken over the responsibilities of the ever-important AJPS inbox, managing the stream of incoming emails, as well as other matters relating to editorial procedures and office management. So when you email AJPS@mpsanet.org, you’ll most likely be sending Julia an email, and she will ably respond or move it along to whoever can respond. We also have new staff at AU who will assist with technical checks, including Ryan Detamble, a Ph.D. student in the Department of Government.

I wrote last week about changes in the submissions process. We will do our best—with new staff as well as updated guidelines—to move manuscripts through technical checks as fast as possible, but we need your help to do so. Please send any questions about new submissions—as well as manuscripts already under review—to AJPS@mpsanet.org, and we will respond as quickly as possible.

A few notes about those updated submission guidelines. In my post last week, I noted the changes but didn’t spend much time explaining the rationale for them, which is: to improve the peer review process. Providing the names of co-authors from the past five years for every author on submitted papers will allow us to avoid “obvious” conflicts of interest that are both difficult and time-consuming for editors or staff to identify. Additional details regarding author anonymity—and the need to disclose whether the manuscript being submitted is part of a larger project—are essential to allowing the editors (if not reviewers) to assess the independent contribution of each manuscript we review. (If you don’t know what “salami-slicing” is in the editing/academic publishing world, maybe ask about that at APSA…) We don’t want to publish papers that are re-treads, or that make marginal advances—which means that the AJPS editorial team and reviewers need to be able to evaluate every manuscript in light of what it contributes, apart from related book projects, other publications—and even related papers under review. We encourage you to follow the updated guidelines as closely as possible—but also to please email us with any questions or concerns you have about related work before you submit your manuscript.

Finally, we announced last week that we generally expect supplemental information (SI) files to be no longer than 20 pages. Our primary goal here is to enhance the use of the SI as part of the review process—which means, for some authors, more focus and intentionality about what and how much is provided in a supplemental file, so that AJPS publishes papers that stand on their own. We plan to discuss these policies with editorial board members at APSA and will consider additional updates to the guidelines as needed. Until then, we ask that authors work toward limiting SI files to 20 pages, and send any questions about this to AJPS@mpsanet.org.

With the Editorial Manager portal open again—and the deadline for APSA papers approaching (or past?)—it will be a busy week. I hope it’s a good one on all fronts, as the summer draws to a close.

Some Details about New AJPS Submission Requirements

By Jan Leighley, AJPS Interim Lead Editor

I’m a firm believer in celebrating every step along the way when the goal is to publish research at a peer-reviewed journal: when the manuscript is “done”; when it’s submitted; when there is a decision (whether positive or not, and whether final or not); when page proofs arrive; when page proofs are completed; when the manuscript is published online; and when the print version arrives. As with many aspects of academic life, publishing always takes longer, and is always more complicated, than one would like.

Celebrating the submission of a manuscript to the AJPS has never been more important—in part because of the journal’s high impact ranking, but also because, on a more practical level, we are now asking authors to do a bit more as they submit manuscripts. Effective immediately, we have added several new details to the submission process. All authors who plan to submit a manuscript should take a look at the updated submission guidelines we have posted online before finishing the manuscript for submission.

One of the key changes we have made is to limit the length of Supporting Information documents to 20 pages or fewer. While it is true that online “space” (where Supporting Information documents are published) is unlimited, the time and attention of editors and reviewers are not. We hope this page limit results in more thoughtful and focused decisions about what additional details are provided—but also helps to produce papers that can “stand alone,” without a seemingly endless dumping of additional details and analysis into the ever-present “Supporting Information” file.

The new manuscript guidelines also provide more details about how we expect author anonymity to be maintained in the manuscript. Here, we also now ask corresponding authors to provide details about other related papers under review, or book manuscripts in development. We hope this clarifies what information authors are expected to provide to allow reviewers to assess the manuscript’s theoretical and empirical contributions, independent of other related work. Whether cited in the submitted paper or not, if other papers “in progress” overlap with the AJPS submission, we want to know about them.

Relatedly, we now ask corresponding authors to provide the names of co-authors from the past five years for every author of the submitted manuscript, along with identifying each author’s dissertation chair. This allows us to better avoid conflicts of interest as we invite reviewers on manuscripts—in a growing, increasingly complex discipline that now reaches across continents. Though we are each experts in our respective subfields (and more, at times), we simply cannot know all the professional and personal connections that might compromise the peer-review process.

And, finally, we have put procedures in place to implement the MPSA council policy regarding editorial conflicts of interest. All authors should review that policy so that the manuscript’s corresponding author is able to identify any potential conflicts of interest with the current editorial team. As dictated by council policy, the MPSA Publishing Ethics Committee (chaired by Sarah Binder) provides guidance on cases where an alternative editorial process is required.

Thanks to all of our authors for sending their best work to us—and for helping us to provide the most efficient and rigorous review process possible. As always, send questions about the submission and review process to ajps@mpsanet.org, and best wishes for the start of the new semester.

What AJPS is Publishing When, and Staff Transitions

By Jan Leighley, AJPS Interim Lead Editor

After a whirlwind editorial transition, we are looking forward to a reprieve from the daily submissions that require our attention. We are taking a two-week break from the workload (instead of the usual month-long hiatus), which will allow us to catch up on decisions, pester reviewers and tend to the usual pre-APSA and Fall Semester preparations.

Of course, the two-week break will not slow down the publication of accepted papers; volume 62:3 was just released in July. All of the papers in 62:3 were shepherded through review and publication by Bill Jacoby and his staff. The same will hold true for nearly all the papers in 62:4—to be published in October—where we have had the privilege of moving some of the papers into the final decision or production stages.

Substantively, I have been surprised by the large number of comparative politics submissions we are receiving—though a careful review of 62:3 and 62:4 should have made it obvious that, contrary to some rumors, the AJPS publishes far more than “just” American politics. Relatedly, I have received emails asking whether the AJPS publishes qualitative research papers, as rumor has it that the AJPS does not. Just to clarify: come August 20, the AJPS will be open for submissions again, and we welcome submissions across all subfields and all methodological approaches. As a general journal, we seek to publish the best work across the discipline, papers that offer theoretical, methodological, and empirical advances.

I am also using the time over break to make the staff transition from Michigan State to American University, which we couldn’t do during the abbreviated transition period in the spring. I am grateful to the MPSA, Michigan State and, most importantly, the exceptional staff who continued to work for us as we sorted out editorial issues. Marty Jordan, who was Managing Editor for the past year, continued to play this key role for the past few months, while Nathaniel C. Smith and Jessica A. Schoenherr also continued in their roles as Editorial Assistants. We simply could not have caught up from the month-long hiatus and started to review manuscripts (old and new) without each of them continuing the fine work they had done over the past year.

Aside from manuscript decisions, though, who is doing what here in the editorial office will change. I am pleased to introduce Julia Salvatore as our new Editorial Administrative Assistant. Julia will be, most importantly, managing the ajps@mpsanet.org inbox, responding to author, reviewer and associate editor queries, among other office management tasks. Marty Jordan will be shifting to Production Editor, handling all post-decision matters associated with the publication of accepted papers, including replication and post-production communications.

I thank you for your patience as we shift Marty, Julia and others into new and different responsibilities and tasks. Hopefully the slower pace of the editorial office during the break will minimize any disruptions due to the staff transition. As always, send questions or concerns to ajps@mpsanet.org.

With Thanks to Our Reviewers and Editorial Board Members

One of the ways that peer-reviewed journals advance the frontiers of scholarly knowledge in political science is by highlighting the best—theoretical and empirical—work of political scientists. Another way is in providing academics with the opportunity to engage in a “virtual colloquium” when they submit papers for review. Though anonymous, reviewers provide an invaluable service by engaging with the ideas, arguments and evidence presented in our manuscripts.

As an editorial team, we are impressed daily with the quality of the reviews we receive. We value the time and contributions of our reviewers, as we cannot make the right decisions on manuscripts without relying on the expertise reflected in our reviewers’ comments. We publish a list of our reviewers on the journal’s website every year, but this is hardly sufficient recognition of our appreciation—or of the reviewers’ contributions. So, to those of you who take time out of your busy work schedules and respond positively to our invitations to review manuscripts, I would like to say: thank you.

I would also like to thank those of you who have agreed to serve on our editorial board this year. This distinguished group of scholars—who, I’ve warned, will likely be asked to do more reviews, in less time, than others who review for us—will also advise us on editorial policies and provide critical feedback on a variety of issues relevant to the peer-review process at AJPS. We will have our first editorial board meeting in Boston during the APSA meeting in late August/early September, and we look forward to discussing a wide range of issues then. Be sure to watch for policy updates on the AJPS website that result from that meeting.

Until then, we will continue to provide the most efficient and relevant reviews of the papers that are submitted to AJPS. We do plan to close the AJPS portal to new submissions from August 4 through August 19. Since this isn’t the typical month-long hiatus, we needed to call it something else. We thought about calling this our “August recess,” following the time-honored tradition observed down the street from AU, the Congressional Recess. Some of us thought recess sounded like elementary school, and suggested instead that it be our “August break.” But since we’re only closing to submissions and otherwise making decisions, calling it a break made it sound like more of a vacay than is the case. Finally, we considered taking an “August holiday,” but, well, that just sounds silly. So, bottom line: we’re working, the first two weeks of August, just not accepting new submissions.

Hope your recess/break/vacay/holiday this summer is a good one!

Jan Leighley, Interim Editor

The American Journal of Political Science (AJPS) is the flagship journal of the Midwest Political Science Association and is published by Wiley.