Changes to AJPS Conflict of Interest Policy

From: Dan Reiter and Adam Berinsky, editors-in-chief

This post describes a change to the Conflict of Interest (COI) policy pertaining to potential reviewers of manuscripts submitted to AJPS.

COI policies for AJPS are described on this webpage (https://ajps.org/conflict-of-interest-policy/), and currently include the following text:

“Nature of Conflicts Relevant to this Policy. Not all conflicts of interest are prohibited or harmful to the MPSA. The association recognizes that our association, disciplines, and scholarly communities are relatively small, with potentially complex (collegial or competitive) relationships.  However, the following professional or personal relationships between authors and editors are conflicts of interest that are prohibited:

    • current or former dissertation committee chair or committee member (ever)
    • current colleagues at the same institution
    • current professional research, teaching or funding collaborators
    • current or former spouses or partners”

Regarding former coauthors, AJPS currently treats anyone who has coauthored with a manuscript author within the past five years as having a COI.

This policy has begun to create difficulties for the review process. Scholars increasingly write in ever-larger teams of coauthors. In one recent extreme case, a coauthor on a submitted manuscript is a coauthor on another paper with 161 coauthors. More commonly, we now receive manuscripts that list dozens of scholars who are ineligible to serve as reviewers under current COI policies. The problem is compounded when a manuscript submitted to AJPS itself has multiple coauthors, each of whom brings many past coauthors of their own. All of those past coauthors are currently treated as having a COI, which narrows the potential reviewer pool for a growing number of manuscripts.

This narrowing of the reviewer pool both slows the review process and reduces its quality. It slows the process because it takes the AJPS editorial staff longer to assemble a panel of three reviewers who have no COIs with the manuscript authors. It reduces quality because eliminating a larger set of potential reviewers forces the editorial staff to reach out to scholars who are less likely to be familiar with the topic of the manuscript and less likely to have strong scholarly reputations.

However, the general principle of avoiding COIs in the review process remains important. We must have reviewers who can evaluate a submitted manuscript objectively, without personal or professional ties that might distort the neutrality of their review.

With these considerations in mind, we have made the following change to the current AJPS COI policy. To the paragraph above, we have added the following text:

“Potential reviewers who have coauthored with one of the manuscript authors within the past five years are considered to have a Conflict of Interest and are not eligible to review. An exception is if the coauthorship did not create a close professional relationship between the potential reviewer and the manuscript author. An example of coauthorship not creating a close professional relationship would be if the coauthoring team was large, the manuscript author and potential reviewer were not lead scholars on the project, and the manuscript author and potential reviewer had very little or no direct communication with each other. Note that even if coauthorship did not create a close professional relationship between the manuscript author and the potential reviewer, other factors may have created a COI, such as if the potential reviewer served as the dissertation advisor of the manuscript author. We leave it to the discretion of the manuscript authors to determine when coauthorship did not create a close professional relationship.”

We believe that this approach strikes an appropriate balance between competing interests. It should reduce the number of scholars excluded from consideration as reviewers because of COI issues, and it will not significantly undermine the objectivity of the review process. When there are large groups of coauthors on a paper, many of those coauthors often have only loose ties to one another. They may work at different universities or even in different countries, and may never have met or communicated directly with each other. Within these large groups, there may be a small number of project leaders and then a team of far-flung researchers, all of whom are sometimes classified as coauthors with COIs under the current rule. We believe that within any large set of coauthors, any single coauthor is likely to have a smaller set of coauthors with close professional relationships who should be classified as COIs; other coauthors need not be. Note that if the rest of the coauthor team includes individuals with whom an author has other kinds of ties, such as spouses, dissertation advisers, or colleagues at the same institution, they remain classified as having COIs independently of this rule change.

AJPS AI Policy

From: Dan Reiter and Adam Berinsky, editors-in-chief

We have been discussing how AJPS should handle the issue of artificial intelligence (AI) in the creation and review of journal submissions. These are important issues we feel we must address. In developing these policies, we had several conversations with other political science journal editors. Though policies and preferences vary across journals and editors, a common thread is an emphasis on transparency: requiring authors and reviewers to disclose if and how AI was used. There was also concern about the lack of reliable tools for detecting the use of AI by an author or reviewer, which argues for requiring disclosure rather than attempting active policing.

With these concerns in mind, we have crafted the following policies regarding the use of AI at AJPS, for both authors and reviewers.

For Authors

The American Journal of Political Science requires that manuscript authors disclose the use of any artificial intelligence tools in work on any element of a submitted manuscript, or in any research conducted to produce the manuscript, for tasks such as copyediting, drafting pre-analysis plans, writing software code, and producing mathematical proofs, among others. This disclosure should be made in the text or footnotes of the manuscript, and the text of the disclosure statement must be included in the Author questionnaire at the time of submission. It is the responsibility of the authors to ensure the validity of any elements produced with artificial intelligence. It is also the responsibility of the authors to ensure that any use of artificial intelligence does not violate ethical guidelines, such as those governing the treatment of human subjects. Authors should avoid using artificial intelligence to write the manuscript or substantial elements of it, such as the literature review. Authors must also comply with Wiley’s AI guidelines for researchers.

For Reviewers

Reviewers may use AI as part of their normal workflow (e.g., finding related papers, copyediting), but reviewers cannot use AI to directly evaluate a paper or write any part of a reviewer report. Reviewers should also comply with Wiley’s AI guidelines for researchers.

Updates Regarding Supplemental Information on Manuscript Submissions

From: Dan Reiter and Adam Berinsky, editors-in-chief

AJPS announces a slightly revised policy for manuscript submissions. The page limit on supplemental materials has been increased to 25 pages, and we now also permit the submission of pre-analysis plans (PAPs). There is no page limit on PAP submissions, and PAP length does not count against the supplemental materials page limit or the 10,000-word limit for the manuscript. We hope that this will help authors present their research more fully and transparently to reviewers.

Editorial Principles at AJPS

From: Dan Reiter and Adam Berinsky, editors-in-chief

General Statement of Principles

AJPS has for decades been recognized as one of the top two or three journals in all of political science.  The central goal of the new editorial team is to maintain and, if possible, further strengthen this reputation.

AJPS endeavors to publish the most significant research in political science.  The significance of a submitted manuscript is generally determined by three different factors:

  • Importance of scholarly question for political science discipline
  • Theoretical innovation and contribution
  • Empirical contribution

We offer the following observations on how to think about these three criteria of significance, and other pertinent issues.

  1. Each article may excel more on one dimension than on others, but we expect most articles to meet high standards on multiple dimensions.
  2. The importance of the scholarly question is a gateway criterion. Manuscripts that tackle relatively minor scholarly questions should be redirected to more appropriate journals. Further, manuscripts addressing non-political questions, such as geographic determinants of economic growth, should also be redirected to other journals.
  3. Most manuscripts other than methodology manuscripts should have at least some theoretical content, in the sense of proposing, presenting, or synthesizing broader theoretical assumptions to motivate hypotheses and empirical analysis. Normative political theory manuscripts, of course, differ in structure and aspiration.
  4. Most manuscripts with empirical contributions, but without theoretical innovation, are less likely to find a home at AJPS. This is true even for manuscripts presenting new empirical data. Manuscripts that offer new empirical tests of standing theories and hypotheses are widely published in the growing set of subfield and specialty journals.
  5. A leading trend in recent years has been growing concern with causal inference and with constructing empirical tests that convincingly demonstrate the causal processes outlined in theories. AJPS can and should continue to demand that authors do their best to address issues of causal inference, using the most advanced available methods. In addition, when making causal claims, authors should give an honest accounting in their manuscripts of which findings can be interpreted causally and under what assumptions those findings hold. At the same time, it is important to remember that an exclusive focus on causal inference risks narrowing the field: some areas of great scholarly significance inevitably face limits on the degree to which causal inference can be established within plausible empirical designs. Innovative and high-quality descriptive work can find a home at AJPS as well. Thus, AJPS welcomes work dealing with questions of great political significance, including papers that address causal questions and those providing new descriptive or predictive understandings.
  6. AJPS welcomes formal theory manuscripts that also contain careful empirical tests. However, including careful empirics alongside a formal model is neither necessary nor always possible, especially given word count constraints. That said, it is important for all formal theory papers to retain some connection to empirics, even if only through historical or policy illustrations or a discussion of how the theory provides new insight into existing empirical work.
  7. Continuing past policies, AJPS will decline manuscripts that are purely focused on historical cases or contemporary policy debates without connection to theory or method. AJPS also declines papers that are primarily pedagogical, surveys of existing results (with the exception of meta-analyses), and manuscripts that do not build on contemporary political science scholarship (though recognizing that innovation outside of existing research paradigms is something to strive for).
  8. It is the burden of the author to make their manuscript clear to reviewers. If reviewers are unable to understand the central components of a manuscript’s claims, then that is the fault of the authors, not the reviewers. Manuscripts must strive to be clear to reviewers, all of whom have scholarly backgrounds.
  9. Pre-registration of an analysis plan means committing to analytic steps without advance knowledge of the research outcomes. Pre-registration is neither necessary nor sufficient for good research, but pre-analysis plans (PAPs) help authors conduct well-considered research in a transparent way. Pre-registration reminds us to think carefully through the research question, our expectations, and all of the minor and major decisions that may influence the research outcomes before the data are collected and the analyses are run. AJPS does not require authors to submit PAPs for any studies, but encourages authors to consider such plans when appropriate. For instance, many reviewers of experimental work ask to see PAPs.
  10. Authors should describe in detail their sampling procedure. In the case of survey and experimental work, this description should include explicit statements of sample design and the treatment of non-response. The AJPS does not have any requirements for specific sampling procedures, but authors should be prepared to justify and defend their sampling choices. During the review process any criticism of samples must be based on a serious discussion of why the sample is not appropriate for the given analysis conducted by the author.
  11. For the first time, AJPS will publish shorter essays, called “Research Notes.” Research notes will have a 4,000 word limit. The title of each Research Note will start with the words, “Research Note,” as in, “Research Note: Populism and Violence Against Immigrants,” to help distinguish research notes from full-length articles.  Research notes at AJPS will be confined to methodology papers (including methodology papers in normative political theory) and meta-analyses.  Research notes will not be essays that primarily present new data, or that offer replications of previous studies without significant theoretical or research design innovations.

It Takes a Submission: Gendered Patterns in the Pages of AJPS

Kathleen Dolan and Jennifer L. Lawless

When we became editors of the American Journal of Political Science on June 1, 2019, we stated that one of our goals was to understand the patterns of submission and publication by authors from underrepresented groups. We begin that examination by presenting data on submission and publication rates of women and men. We focus on manuscripts submitted to the journal between January 1, 2017 and October 31, 2019. This time period spans three different editors/editorial teams: Bill Jacoby served as editor from January 2017 until April 2018; Jan Leighley from April 2018 through May 2019; and we have been co-editors since June 2019. Although our editorial team was in place for only the last five months of this period, we wanted to examine a long enough time span to get a good sense of any gendered patterns that exist in the pages of AJPS.

We view these data as contributing to recent conversations about the representation of women as authors and as cited authorities in political science journals. Michelle Dion and Sara Mitchell, for example, recently published a piece in PS about the citation gap in political science articles.[1] They compare the gender composition of membership in several APSA organized sections with the gender balance in citations published by each section’s official journal. Dawn Teele and Kathleen Thelen document a lower percentage of female authors in 10 political science journals than women’s share of the overall profession.[2]

We take a different approach. Because we have AJPS submission data, we can examine the link between gender gaps in submission rates and subsequent publication rates. After all, women and men can be under- or over-represented in the pool of published articles only in proportion to their presence in the pool of submitted manuscripts. We believe that attention to the appropriate denominator offers a clearer picture of authorship patterns.

Submissions
During the period under examination, 4,916 authors submitted manuscripts and received final decisions from AJPS. Women accounted for 1,210 (or 25%) of the submitting authors.

At the manuscript level, the gender disparity was less substantial. Of the 2,672 manuscripts on which an editor issued a final decision, 945 (or 35%) had at least one female author.

The lion’s share of the manuscripts that included a female author, however, also included at least one male co-author (see Figure 1). Indeed, we processed four and a half times as many manuscripts written only by a man or men (65%) as we did those authored only by a woman or women (14%).

Homing in on the 1,238 solo-authored manuscripts, we find that 962 came from men. Women, in other words, accounted for just 22% of the solo-authored submissions we received.
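For readers who want to check the arithmetic, the percentages above follow directly from the counts we report; the short R snippet below simply recomputes them (it is not part of the original analysis, and the 276 figure is just 1,238 minus 962).

```r
# Recomputing the submission-side shares from the counts reported above
round(100 * 1210 / 4916)           # 25: women's share of submitting authors (percent)
round(100 *  945 / 2672)           # 35: manuscripts with at least one female author (percent)
round(100 * (1238 - 962) / 1238)   # 22: women's share of solo-authored submissions (percent)
round(65 / 14, 1)                  # 4.6: men-only manuscripts relative to women-only manuscripts
```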

Figure 1. Composition of Authors for Manuscripts Submitted to AJPS
Notes: Bars represent the percentage of manuscripts that fall into each category. The analysis is based on the 2,672 manuscripts for which we issued a final decision (accept or decline) from January 2017 – October 2019.

Decisions
Whereas striking gender disparities emerge during the submission process, we find no significant gender differences when it comes to manuscript decisions. During this time period, we accepted roughly 6% of submitted manuscripts. Those submissions included a total of 307 authors, 75 of whom were women. Thus, women comprised 24% of accepted authors – this is statistically indistinguishable from the 25% of female submitting authors.[3] Notice, too, that our rates of acceptance are consistent across the composition of authors. Regardless of how many women or men author a piece, only about 6% are accepted for publication. None of the differences across categories in Figure 2 is statistically significant.

Given the comparable acceptance rates across author composition, it’s no surprise that the percentage of female authors on our pages is roughly the same as the proportion of manuscripts submitted that included at least one female author (35%). Of course, given that most of the manuscripts submitted by women also include at least one male co-author, 84% of the articles published during this time had at least one male author. 

Figure 2. Manuscript Acceptance Rates at AJPS, by Composition of Authors
Notes: Bars represent the acceptance rate for manuscripts in each author-composition category. The analysis is based on the 2,672 manuscripts for which we issued a final decision (accept or decline) from January 2017 – October 2019.

A COVID-19 Caveat
Over the course of the last several weeks, submissions at AJPS have picked up substantially (as compared to the same month last year). It’s impossible to know whether to attribute the uptick to MPSA conference papers that were no longer awaiting feedback, more time at home for authors, different teaching commitments, etc. But we examined the 108 submitted manuscripts we received from March 15th through April 19th to assess whether the patterns from the larger data set have been exacerbated amid COVID-19. After all, women are still more likely than men – even among high-level professionals – to shoulder the majority of the household labor and childcare or elder care responsibilities. It wouldn’t be surprising if the gender gap in manuscript submissions grew during this time.

The data reveal that it hasn’t. The 108 manuscripts we processed in this month-long period included 54 female and 108 male authors. So, women comprised 33% of submitting authors, which is actually somewhat higher than usual (remember that women comprised 25% of the authors in the 2017 – 2019 data set).

At the manuscript level, 41 of the 108 papers had at least one female author. That’s 38% of the total, which is again a slightly greater share than the 35% of manuscripts with at least one female author in the larger data set.

This doesn’t mean that COVID-19 hasn’t taken a toll on female authors, though. Women submitted only 8 of the 46 solo-authored papers during this time. Their share of 17% is down from 22% in the larger data set. As a percentage change, that’s substantial. Even if women’s overall submission rates are up, they seem to have less time to submit their own work than men do amid the crisis.
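The same kind of arithmetic check applies to the COVID-period counts reported above (again, this only recomputes the stated figures; the final line uses the rounded 22% and 17% shares):

```r
# COVID-period shares (manuscripts received March 15 - April 19)
round(100 * 54 / (54 + 108))   # 33: women's share of submitting authors (percent)
round(100 * 41 / 108)          # 38: manuscripts with at least one female author (percent)
round(100 *  8 /  46)          # 17: women's share of solo-authored papers (percent)
round(100 * (22 - 17) / 22)    # 23: relative decline in women's solo-authored share (percent)
```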

Conclusions
In examining the gendered patterns in submission and publication at AJPS over the past three years, we see two different realities. In terms of “supply,” there is a large disparity. Women constitute just one-quarter of submitting authors, and their names appear on only one-third of submitted manuscripts. But when it comes to “demand,” there is no evidence of clear bias in the review or publication process. Women’s ratios on the printed pages are indistinguishable from their ratios in the submission pool. As long as it’s the case that women are less likely than men to submit manuscripts to AJPS, the gender disparities in publication rates will remain.

Given these findings, and the work we do, we would be remiss not to draw a comparison to the political arena. We’ve known for decades now that when women run for office, they do as well as men. They win at equal rates, raise as much money, and even garner similar media coverage. Yet women remain significantly under-represented in U.S. political institutions. Why? Because they look at a political arena where they are significantly under-represented and assume (rationally) that widespread bias and systematic discrimination are keeping them out. Because they think that in order to be qualified to run for office, they need to be twice as good to get half as far. Because they’re less likely than men to receive encouragement to throw their hats into the ring.

But we also know that when women are encouraged to run for office, they’re more likely to think they’re qualified and they’re more likely to give it a shot.

So as a discipline, it’s incumbent upon us to encourage female scholars to submit their work to AJPS and other top journals. It’s our responsibility to let them know that their work is just as competent and just as important as that of their male colleagues. We are not so naïve as to believe that encouragement is all it takes to close the gender gap in rates of submission. That women are still not similarly situated with men in important resources (tenure track jobs, research support, family obligations) poses obstacles that encouragement alone cannot surmount. But while the discipline continues to address these resource gaps, we can change the face of tables of contents by dispelling the myth that women do not succeed when they submit their work.

[1] Dion, Michelle L. and Sara M. Mitchell. 2020. “How Many Citations to Women Is ‘Enough?’ Estimates of Gender Representation in Political Science.” PS: Political Science & Politics 53(1):107-13.

[2] Teele, Dawn Langan and Kathleen Thelen. 2017. “Gender in the Journals: Publication Patterns in Political Science.” PS: Political Science & Politics 50(2):433-47.

[3] These results are consistent with a 2018 symposium on gender in the American Political Science Association’s journals. See “Gender in the Journals, Continued: Evidence from Five Political Science Journals.” PS: Political Science & Politics 51(4).

AJPS Editor’s Blog

Covid-19 has thrown everything off kilter, even academic journals. Here at AJPS, we have seen two patterns in the past two or three weeks – a 27 percent increase in manuscript submissions AND a 54 percent decline in review invitations accepted – relative to the same period last year. While AJPS reviewers have terrific turnaround times, we realize that people may be delayed in returning reviews this semester. These figures suggest that manuscript processing might take a bit longer from start to finish for this “Covid-19 cohort.” As a result, we ask authors for patience and for gratitude toward the colleagues doing this work.

AJPS Editor’s Blog

Covid-19 Update:

The AJPS continues to process manuscripts.  We understand that people have many things going on during this time of crisis, so please know that we are happy to be flexible with deadlines for reviews and manuscript revisions.  If you receive a request to review and can’t accept, we understand. If you can review, but need more than the usual time frame, just ask for an extension. We ask authors for patience and empathy during this time as we continue to work.

AJPS Editor’s Blog

December 1st marked our first six months as co-editors of AJPS. We want to thank our associate editors, board members, authors, and reviewers for a very smooth transition. Now that we have some experience under our belts, we thought we’d offer some thoughts about several topics that might be helpful to reviewers and authors alike. So, on the occasional Tuesday, we’ll post a short entry about some aspect of the journal submission, review, or publication process that we’ve had to address over the course of the last six months. While these issues are probably relevant to most journals, we only speak for ourselves and our expectations for AJPS.

Proposing and Opposing Reviewers

AJPS, like many other journals, gives authors an opportunity to suggest appropriate reviewers as well as to identify scholars they might like us to avoid. Over the course of the last few months, it’s become clear that a few dos and don’ts might be helpful as you complete the proposing and opposing reviewers boxes in Editorial Manager.

Here are three tips when proposing reviewers:

  • Avoid common conflicts of interest – your department colleagues, recent co-authors, members of your dissertation committee, etc. are not appropriate reviewers. In fact, screening for those people is part of our technical check process.
  • Please don’t recommend people who have previously read and commented on the manuscript, whether as a conference discussant or as a more informal collegial favor. Whenever possible, we prefer people to come to a manuscript with a fresh set of eyes.
  • Identify subfield experts, experts in your particular methodology, or younger scholars who might not yet be a part of our reviewer database.

Opposing reviewers is a bit trickier. Without suggesting that academics ever engage in petty personal squabbles or territorial disputes over subject areas, we understand that conflicts between scholars can color a reviewer’s assessment. If you believe that a likely reviewer is not well-suited to assess the manuscript objectively – perhaps you’ve had personal disputes, maybe the person was unprofessional on a conference panel, we could go on – then please provide a brief statement as to why we should avoid this scholar. Simply noting that the person comes at your question from a different perspective is insufficient. Indeed, these are the very people who should be reading and evaluating your work.

As editors, we take these suggestions into account, although we do not guarantee that we will follow any of the suggestions.  But providing us with specific reasons for whom you propose and whom you’d like us to avoid will help us best evaluate the request.

AJPS Editor’s Blog

December 1st marked our first six months as co-editors of AJPS. We want to thank our associate editors, board members, authors, and reviewers for a very smooth transition. Now that we have some experience under our belts, we thought we’d offer some thoughts about several topics that might be helpful to reviewers and authors alike. So on the occasional Tuesday, we’ll post a short entry about some aspect of the journal submission, review, or publication process that we’ve had to address over the course of the last six months. While these issues are probably relevant to most journals, we only speak for ourselves and our expectations for AJPS.

Passing Verification with Flying Colors

For nearly five years now, AJPS has verified quantitative manuscripts. And last year, we verified our first qualitative paper as well. As most of you know, this doesn’t mean that we replicate authors’ analyses. It “simply” means that we reproduce – using the authors’ data, code, and/or interview transcripts – the results presented in the paper. Notice that simply is in quotes. That’s because most papers don’t pass verification by the Odum Institute in the first round. Sometimes the data set won’t open. Sometimes the computing clusters where the authors conducted the analyses can’t “talk” to the cluster where the analyses are being verified. Sometimes the code includes typos or other minor errors. Sometimes it’s all of the above. Regardless of the problem, though, when a paper can’t be verified, it goes back to the author for revision and resubmission. Multiple rounds of back and forth are, unfortunately, not uncommon.

Given that authors now conduct increasingly complex analyses that regularly rely on multiple data sets, we figured that this would be a good time to offer three helpful hints that can make the verification process less cumbersome and more efficient:

  1. Verify your own analyses first. Before uploading your data set and code to Dataverse, run the code and see if you get the exact same results you report in the paper (a minimal sketch of such a script appears after this list). Sounds obvious, right? Well, it turns out that most authors actually don’t do this. But those who do have reported a relatively seamless verification process – they’ve caught glitches, typos, and bugs. If you can perform this task on a different computer than the one you typically use, even better. That increases the probability that the code will run when Odum begins the verification.
  2. Keep the code as slim as possible. After a paper has been verified, if you make any changes that involve the verified results, then the whole thing must go through the process again. Let’s say you want to modify a figure’s title and source note when you receive the page proofs. Well, if you generated the title and note in Stata or R and it’s part of the code, then the whole paper needs to go back to Odum. To avoid unnecessary delays, try to remove from the code anything that’s not part of the actual results or formatting of the figure itself. If you can do it in Word, take it out of the code.
  3. Talk to the IT person at your institution. If you know that your analyses have caused you any trouble or required somewhat unusual accommodations – like a super-computer or major technical assistance merging data sets – then prepare a document that summarizes how you solved these problems and the specifications of the computing environment you used. That short memo will go a long way toward helping Odum determine the best way to verify the analyses without having to troubleshoot all of the issues you already confronted.
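To make hints 1 and 2 concrete, here is a minimal sketch of what a self-contained replication script might look like, written in R. Every file name, variable, and model in it is a hypothetical placeholder rather than an AJPS requirement; the point is the structure: relative paths, a fixed seed, and no cosmetic text (titles, source notes) generated inside the analysis code.

```r
# replication.R -- a hypothetical, minimal replication script (illustrative only).
# Assumes the replication folder contains data/survey.dta and a figures/ subfolder.

rm(list = ls())                      # start from a clean environment
set.seed(20190601)                   # fix the seed so any simulated quantities reproduce exactly

library(haven)                       # for reading Stata-format data
dat <- read_dta("data/survey.dta")   # relative path, so the folder runs on any machine

# Table 1 (illustrative model and variable names)
fit <- lm(turnout ~ treatment + age, data = dat)
summary(fit)

# Figure 1: generate only the plot itself; add titles and source notes in the
# manuscript file instead (see hint 2), so cosmetic edits at the proofs stage
# do not require re-verification of the code.
png("figures/figure1.png")
plot(dat$age, dat$turnout, xlab = "Age", ylab = "Turnout")
dev.off()
```

Running this one file from top to bottom in a fresh session, ideally on a second machine, is exactly the self-check described in hint 1.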

We know that the verification process is clunky. But taking these three steps will undoubtedly make it a lot smoother and much faster. It’ll also make it that much easier when you want to verify someone else’s results.

AJPS Editor’s Blog

December 1st marked our first six months as co-editors of AJPS. We want to thank our associate editors, board members, authors, and reviewers for a very smooth transition. Now that we have some experience under our belts, we thought we’d offer some thoughts about several topics that might be helpful to reviewers and authors alike. So, on occasional Tuesdays, we’ll post a short entry about some aspect of the journal submission, review, or publication process that we’ve had to address over the course of the last six months. While these issues are probably relevant to most journals, we only speak for ourselves and our expectations for AJPS.

Appendices, Supporting Information, Supplemental Materials, You Get the Picture

In the old days – like the early 2000s – most articles appeared exclusively in print. Authors struggled with word counts because submissions had to include all relevant material, including appendices. Online supplemental appendices now allow authors to focus the body of the text on telling the main story. Details about survey questions, experimental treatments, alternative model specifications, robustness checks, and additional analyses can be relegated to the appendix. The upside is that articles themselves can be shorter, crisper, and more straightforward, but readers can still find clarifying information in the appendix. The downside is that some authors have taken a “more is better” and “better safe than sorry” approach to appendix compilations. In our six months on the job, we have received 10,000-word manuscripts that are “supported” by 50, 75, even 100-page appendices. Most appendices aren’t this long, but almost every manuscript now comes with significant supplemental materials.

We understand why authors do this. Why not preempt any concern a reviewer might raise, provide every possible alternative model specification, and share every detail about the research design and protocol? The problem is that while appendix space may seem “free” to authors, it comes with a substantial cost to reviewers, who are now often faced with a 10,000-word manuscript and an equally long or longer appendix. Anything that increases the burden on reviewers makes an overworked system even more precarious.

At AJPS, we limit supplemental appendices to 20 pages. We believe that this gives authors sufficient space to provide additional information that might not belong in the body of a manuscript but is still important to the paper’s central contribution. In enforcing this limit, we ask authors to think carefully about what they really need to include in an appendix verbatim versus what they can summarize. If you run three experiments with identical treatments, you only need to offer the script of the treatment once. If you’re providing alternative analyses, you don’t have to provide every model you ever ran or think a reviewer might anticipate. If the additional material doesn’t merit some discussion in the main paper, then the more elaborate discussion doesn’t belong in the appendix either. As a general rule, we believe that a manuscript must be able to stand on its own. A reader must be able to understand it and find it convincing even without the appendix. The appendix, in other words, should be a place to provide information about “housekeeping” details, not a way to back door in thousands of words you couldn’t fit in the paper itself.

We know that limiting appendix pages can be anxiety-inducing for authors. That’s probably why so many of you request exemptions. But we’ve found that requiring authors to distinguish between what’s essential and what might be extraneous improves the quality of the manuscript and makes the task of reviewing that much easier and more reasonable – something every author appreciates when wearing the hat of a reviewer.

 

The American Journal of Political Science (AJPS) is the flagship journal of the Midwest Political Science Association and is published by Wiley.