Celebrating Verification, Replication, and Qualitative Research Methods at the AJPS

By Jan Leighley, AJPS Interim Lead Editor

I’ve always recommended to junior faculty that they celebrate each step along the way toward publication: Data collection and analysis—done! Rough draft—done! Final draft—done! Paper submitted for review—done! Revisions in response to first rejection—done! Paper submitted for review a second time—done! In that spirit, I’d like to celebrate one of AJPS’s “firsts” today: the first verification, replication, and publication of a paper using qualitative research methods, “The Disclosure Dilemma: Nuclear Intelligence and International Organizations” (https://doi.org/10.1111/ajps.12426) by Allison Carnegie and Austin Carson.


As with many academic accomplishments, it takes a village—or at least a notable gaggle—to make good things happen. The distant origins of the AJPS replication/verification policy lie in Gary King’s 1995 “Replication, Replication” essay, as well as in the vigorous efforts of Colin Elman, Diana Kapiszewski, and Skip Lupia as part of the DA-RT initiative that began around 2010 (for more details, including others who were involved in these discussions, see https://www.dartstatement.org/events), and in the work of many others in between, especially the editors of the Quarterly Journal of Political Science and Political Analysis. At some point, these journals (and perhaps others?) began expecting authors to post replication files, though where the files were posted, and whether publication was contingent on posting them, varied. They also continued the replication discussion that King’s (1995) essay began, as a broader group of political scientists (and editors) started to take notice (Elman, Kapiszewski, and Lupia 2018).

In 2012, AJPS editor Rick Wilson began requiring that replication files for all accepted papers be posted to the AJPS Dataverse. Then, in 2015, AJPS editor Bill Jacoby announced a new policy: all papers published in AJPS must first be verified prior to publication. He initially worked most closely with the late Tom Carsey (University of North Carolina; Odum Institute) to develop procedures for external replication of quantitative data analyses. Upon satisfaction of the replication requirement, the published article and associated AJPS Dataverse files are awarded “Open Practices” badges as established by the Center for Open Science. Since then, the staff of the Odum Institute and our authors have worked diligently to ensure that each paper meets the highest research standards; as of last week, we had awarded replication badges to 185 AJPS publications.

In 2016, Jacoby worked with Colin Elman (Syracuse University) and Diana Kapiszewski (Georgetown University), co-directors of the Qualitative Data Repository at Syracuse University, to develop more detailed verification guidelines appropriate for qualitative and multi-method research. This revision of the original verification guidelines acknowledges the diversity of qualitative research traditions and clarifies how the verification process differs given the distinct features of quantitative and qualitative analyses and the different types of qualitative work. The policy also discusses confidentiality and human subjects protection in greater detail for both types of analysis.

But it is only in our next issue that we will publish our first paper requiring verification of qualitative data analysis: “The Disclosure Dilemma: Nuclear Intelligence and International Organizations” (https://doi.org/10.1111/ajps.12426) by Allison Carnegie and Austin Carson, available online today in Early View with free access. I’m excited to see the AJPS move the discipline along in this important way! To celebrate our first verification of qualitative work, I’ve asked Allison and Austin to share a summary of their experience, which will be posted here in the next few weeks.

Thanks to the efforts of those named here (and those I’ve missed, with apologies), the AJPS is today well-known in academic publishing circles as taking the lead on replication/verification policies—so much so that in May, Sarah Brooks and I will be representing the AJPS at a roundtable on verification/replication policies at the annual meeting of the Council of Science Editors (CSE), an association of journal editors from the natural and medical sciences. AJPS will be the one and only social science journal represented at the meeting, where we will discuss what we have learned and how better to support authors in this process.

If you have experiences you wish to share about the establishment of the replication/verification policy, or questions you wish to raise, feel free to send them to us at ajps@mpsanet.org. And be sure to celebrate another first!

Cited in post:

King, Gary. 1995. “Replication, Replication.” PS: Political Science and Politics 28(3): 444-452. https://doi.org/10.2307/420301

Elman, Colin, Diana Kapiszewski, and Arthur Lupia. 2018. “Transparent Social Inquiry: Implications for Political Science.” Annual Review of Political Science 21: 29-47. https://doi.org/10.1146/annurev-polisci-091515-025429

Peer Review Week at AJPS: Better Late than Never (Just like Reviews!)

By the AJPS Editorial Team

Who knew that September 10-15 was “Peer Review Week,” a celebration held in honor of the essential role of peer review in the academy and scholarly publications? We didn’t, until the week-long celebration was nearly over. Nonetheless, we made it a memorable week on the AJPS reviewer front by updating the “reviewer guidelines” at www.ajps.org. Please take a quick look at those details before submitting your next review.

Here, we offer some additional thoughts about reviewing. Doing reviews often feels like a thankless task, one that takes time away from the more pressing matters in academic life. But we are keenly aware that, without able and willing reviewers, the entire peer-review enterprise would collapse. While these comments reflect our own take on reviews and reviewing, we suspect that many other editors in the discipline share them.

Why should I review? Participating in the peer review of research articles contributes to your scholarly community. It is a way to keep abreast of new ideas and new approaches. It offers an opportunity to use the expertise you’ve developed, via years of study and authoring papers, to advance our collective knowledge. And if the collective, altruistic view of reviewing doesn’t strike a chord, then how about this: it’s a great way to learn something new, whether about substance, methods, or writing, to name a few.

For whom should I review? Many of us receive more review requests than we could ever accept, sometimes from journals that seem a bit outside our usual purview. Review for the journals where you want to publish, and for the journals you regularly read. Reviewing for and publishing in more general journals allows you to have a voice in the leading scholarship in your field. And whatever the impact factor, reviewing for more specialized journals usually shifts the focus to more in-depth material and may benefit your own work even more.

How many reviews should I do? People sometimes don’t have a sense of how often they should review. Editorial Board members are often expected to review more than others; we try not to ask others for reviews more than a couple of times a year. But it varies a lot, both here at AJPS and at other journals, and we as a discipline don’t coordinate on counting up the number of reviews across journals (although Publons may help a tiny bit with this). Data on the peer review process suggest that the burden of reviewing is unevenly distributed. In fact, one reason reviewers decline our invitations is that our “ask” arrives when the potential reviewer has already committed to reviewing multiple papers and can’t add ours to the list.

Some editors like to remind their colleagues that, if they receive three reviews every time they submit a piece, then they should be doing three reviews for every submission or resubmission they make in a given year. At AJPS, we now ask anyone who submits an article to commit to doing two reviews during the coming year. We appreciate that review requests sometimes come at inopportune times (whether or not you have other review commitments in line ahead of AJPS). In other cases, potential reviewers do not feel as expert as they might like. We (and most editors) appreciate why reviewers may need to decline in these situations. But every editor loves it when a declined review invitation is accompanied by suggestions of potential alternate reviewers. Our reviewer database is a work in progress, and we certainly do not know people in every nook and cranny of the profession. Adding new reviewers to our database is critical to the editorial process, so making reviewer recommendations is incredibly helpful. You might also think of making recommendations as an opportunity to share the love with your friends, colleagues, and co-authors!

Why was I invited to review? The peer review process is premised on expertise: you are invited to review because the editors believe you have the expertise to evaluate the argument, claims, and methods used in the paper. That said, you may not be an expert on every aspect, and you do not need to address each of them. You can focus your comments on the issues you believe to be most important as a basis for making a recommendation. Editors will often invite reviewers who have published research on similar—but not exactly the same—questions to provide some diversity across reviewers. This is especially true at less specialized journals, where editors might want to know what scholars in the same subfield, but with different research expertise, think about the paper’s argument, importance, or potential (broader) impact.

What if I know who the author is? Convey this information to the editorial office immediately. It is helpful to explain for how long, and in what capacity, you have known the author and how you know the paper. Some editors will immediately release you from the review invitation (and appreciate being able to do this); others will judge whether the circumstances are sufficiently compromising that they need to find another reviewer. Or the editors may ask the invited reviewer to complete the review if they feel that they can offer an unbiased, serious review independent of knowing the author’s identity.

What if I’ve reviewed the manuscript for another journal? Convey this information to the editorial office as well. Many editors will leave the decision up to the reviewer; some will release you from the review. The basic principle here is that the reviewer and editor should be in agreement as to whether doing the review is the right thing.

What should I include in the review? Many journals will describe what they want in a review when they invite you, or post these details on the journal’s website. Oftentimes this comes as a series of questions that might structure the review: What is the paper’s theoretical motivation? Its theoretical or empirical contribution? Is the method appropriate to the question? How persuasive is the empirical evidence? Is the author’s interpretation of the evidence accurate? Is it appropriate? What are the paper’s strengths and weaknesses? Can the weaknesses be addressed? While providing a summary of the paper (from a few sentences to a lengthier paragraph) can help set the stage for the review, the more important details are your answers to the questions posed by the editors.

In addition, if you are really impressed by a paper, please tell us why. Sometimes reviewers are quicker to criticize than to praise. While we understand (and often share) that urge, keep in mind that we are seeking reasons to accept or advance a piece, as well as reasons to decline it. If you love a manuscript, don’t be afraid to advocate for it. You may find this piece, written by Sara Mitchell (an AJPS Editorial Board member), to be useful.

Will the editor follow my recommendation? Sometimes they will; sometimes they won’t. At most journals, reviews are advisory to the editor. And it is important to remember that, regardless of the time invested in the review, the editor has more information than any single reviewer. She knows who the members of the review panel are and what their review histories are (some reviewers inflate their grades, whereas others do not!). She also often receives private “to the editor” comments from reviewers, which sometimes emphasize certain points made in the review. Moreover, the editor sees reviews for dozens or even hundreds of manuscripts per year. While each manuscript decision depends on the reviews that are submitted, the editors have the experience to assess what good and bad reviews look like, and what a reasonable amount of work for a revision would be.

With all that said… We now declare this week as Peer Review Week at AJPS! So send in those reviews (early, late or on deadline). And take a few minutes to update your contact details and research interests in our Editorial Manager database, so that we know how (and for what) to invite you to review. If you do not have an existing Editorial Manager profile and want to review, begin one—and send along your C.V. to the editorial office at ajps@mpsanet.org so we can learn more.

ICYMI: New People, New Policies

By Jan Leighley, AJPS Interim Lead Editor

Summer breaks are never long enough, eh? It’s back to “normal” business as of earlier today, when the Editorial Manager portal re-opened for new submissions. During our break, I posted some details about the updated submission guidelines, so be sure to take a careful look at some of the changes that we have made (more on those below).

The other change that we made over break was shifting the editorial staff responsibilities from Michigan State University to American University. The transition reflects a fundamental change in the editorial structure: associate editors identify reviewers and draft decision letters based on their readings of the manuscript and the reviews, and I send out those letters after reviewing the manuscript and the reviews myself. One of the biggest changes has been having Marty Jordan, previously the Managing Editor, shift to a new role as Production Editor. In this capacity, Marty will deal with all matters relating to production as well as replication. At AU, Julia Salvatore has taken over the responsibilities of the ever-important AJPS inbox, managing the stream of incoming emails, as well as other matters relating to editorial procedures and office management. So when you email AJPS@mpsanet.org, you’ll most likely be sending Julia an email, and she will ably respond or move it along to whoever can respond. We also have new staff at AU who will assist with technical checks, including Ryan Detamble, a Ph.D. student in the Department of Government.

I wrote last week about changes in the submissions process. We will do our best—with new staff as well as updated guidelines—to move manuscripts through technical checks as quickly as possible, but we need your help to do so. Please send any questions about new submissions, as well as manuscripts already under review, to AJPS@mpsanet.org, and we will respond as quickly as possible.

A few notes about those updated submission guidelines. In my post last week, I noted the changes but didn’t spend much time explaining the rationale for them, which is simple: to improve the peer review process. Providing the names of co-authors from the past five years for every author on a submitted paper will allow us to avoid “obvious” conflicts of interest that are both difficult and time-consuming for editors or staff to identify. Additional details regarding author anonymity—along with the requirement to disclose whether the manuscript being submitted is part of a larger project—are essential to allowing the editors (if not reviewers) to assess the independent contribution of each manuscript we review. (If you don’t know what “salami-slicing” is in the editing/academic publishing world, maybe ask about that at APSA…) We don’t want to publish papers that are re-treads, or that make marginal advances, which means that the AJPS editorial team and reviewers need to be able to evaluate every manuscript in light of what it contributes apart from related book projects, other publications, and even related papers under review. We encourage you to follow the updated guidelines as closely as possible, but please also email us with any questions or concerns you have about related work before you submit your manuscript.

Finally, we announced last week that we generally expect supporting information (SI) files to be no longer than 20 pages. Our primary goal here is to enhance the use of the SI material as part of the review process, which means that some authors will need to be more focused and intentional about what, and how much, they provide in a supplemental file, so that AJPS publishes papers that stand on their own. We plan to discuss these policies with editorial board members at APSA and will consider additional updates to the guidelines as needed. Until then, we ask that authors work toward limiting SI files to 20 pages, and send any questions about this to AJPS@mpsanet.org.

With the Editorial Manager portal open again—and the deadline for APSA papers approaching (or past?)—it will be a busy week. I hope it’s a good one on all fronts, as the summer draws to a close.


Some Details about New AJPS Submission Requirements

By Jan Leighley, AJPS Interim Lead Editor

I’m a firm believer in celebrating every step along the way when the goal is to publish research at a peer-reviewed journal: when the manuscript is “done”; when it’s submitted; when there is a decision (whether positive or not, and whether final or not); when page proofs arrive; when page proofs are completed; when the manuscript is published online; and when the print version arrives. As with many aspects of academic life, publishing always takes longer, and is always more complicated, than one would like.

Celebrating the submission of a manuscript to the AJPS has never been more important—in part because of the journal’s high impact ranking, but also because (on a more practical level) we are now asking authors to do a bit more as they submit manuscripts. Effective immediately, we have added several new details to the submission process. All authors who plan to submit a manuscript should take a look at the updated submission guidelines we have posted online before finishing the manuscript for submission.

One of the key changes we have made is to limit the length of Supporting Information documents to 20 pages. While it is true that online “space” (where Supporting Information documents are published) is unlimited, the time and attention of editors and reviewers are not. We hope this page limit results in more thoughtful and focused decisions about what additional details are provided, and also helps to produce papers that can “stand alone,” without a seemingly endless dumping of additional details and analysis into the ever-present Supporting Information file.

The new manuscript guidelines also provide more details about how we expect author anonymity to be maintained in the manuscript. Here, we also now ask corresponding authors to provide details about other related papers under review, or book manuscripts in development. We hope this clarifies what information authors are expected to provide to allow reviewers to assess the manuscript’s theoretical and empirical contributions, independent of other related work. Whether cited in the submitted paper or not, if other papers “in progress” overlap with the AJPS submission, we want to know about them.

Relatedly, we now ask corresponding authors to provide the names of co-authors from the past five years for every author of the submitted manuscript, along with each author’s dissertation chair. This allows us to better avoid conflicts of interest as we invite reviewers—in a growing, increasingly complex discipline that now reaches across continents. Though we are each experts in our respective subfields (and more, at times), we simply cannot know all the professional and personal connections that might compromise the peer-review process.

And, finally, we have put procedures in place to implement the MPSA Council policy regarding editorial conflicts of interest. All authors should review that policy so that the manuscript’s corresponding author is able to identify any potential conflicts of interest with the current editorial team. As dictated by Council policy, the MPSA Publishing Ethics Committee (chaired by Sarah Binder) provides guidance in cases where an alternative editorial process is required.

Thanks to all of our authors for sending their best work to us—and helping us to provide the most efficient and rigorous review process possible.  As always, send questions about the submission and review process to ajps@mpsanet.org, and best wishes for the start of the new semester.


What AJPS is Publishing When, and Staff Transitions

By Jan Leighley, AJPS Interim Lead Editor

After a whirlwind editorial transition, we are looking forward to a reprieve from the daily submissions that require our attention. We are taking a two-week break from the workload (instead of the usual month-long hiatus), which will allow us to catch up on decisions, pester reviewers and tend to the usual pre-APSA and Fall Semester preparations.

Of course, the two-week break will not slow down the publication of accepted papers; issue 62:3 was just released in July. All of the papers in 62:3 were shepherded through review and publication by Bill Jacoby and his staff. The same holds true for nearly all of the papers in 62:4 (to be published in October), though we have had the privilege of moving some of them into the final decision or production stages.

Substantively, I have been surprised by the large number of comparative politics submissions we are receiving—though a careful review of 62:3 and 62:4 should have made it obvious that, contrary to some rumors, the AJPS publishes far more than “just” American politics. Relatedly, I have received emails asking whether the AJPS publishes qualitative research papers, as rumor has it that the AJPS does not. Just to clarify: come August 20, the AJPS will be open for submissions again, and we welcome submissions across all subfields and all methodological approaches. As a general journal, we seek to publish the best work across the discipline: papers that offer theoretical, methodological, and empirical advances.

I am also using the time over break to make the staff transition from Michigan State to American University, which we couldn’t do during the abbreviated transition period in the spring. I am grateful to the MPSA, Michigan State and, most importantly, the exceptional staff who continued to work for us as we sorted out editorial issues. Marty Jordan, who was Managing Editor for the past year, continued to play this key role for the past few months, while Nathaniel C. Smith and Jessica A. Schoenhoerr also continued in their roles as Editorial Assistants. We simply could not have caught up after the month-long hiatus and started reviewing manuscripts (old and new) without each of them continuing to do the fine work they had done over the past year.

Aside from manuscript decisions, though, who is doing what here in the editorial office will change. I am pleased to introduce Julia Salvatore as our new Editorial Administrative Assistant. Julia will be, most importantly, managing the ajps@mpsanet.org inbox, responding to author, reviewer and associate editor queries, among other office management tasks. Marty Jordan will be shifting to Production Editor, handling all post-decision matters associated with the publication of accepted papers, including replication and post-production communications.

I thank you for your patience as we shift Marty, Julia and others into new and different responsibilities and tasks. Hopefully the slower pace of the editorial office during the break will minimize any disruptions due to the staff transition. As always, send questions or concerns to ajps@mpsanet.org.

With Thanks to Our Reviewers and Editorial Board Members

One of the ways that peer-reviewed journals advance the frontiers of scholarly knowledge in political science is by highlighting the best—theoretical and empirical—work of political scientists. Another way is in providing academics with the opportunity to engage in a “virtual colloquium” when they submit papers for review. Though anonymous, reviewers provide an invaluable service by engaging with the ideas, arguments and evidence presented in our manuscripts.

As an editorial team, we are impressed daily by the quality of the reviews we receive. We value the time and contributions of our reviewers, as we cannot make the right decisions on manuscripts without relying on the expertise reflected in our reviewers’ comments. We publish a list of our reviewers on the journal’s website every year, but this is hardly sufficient recognition of our appreciation, or of the reviewers’ contributions. So, to those of you who take time out of your busy work schedules and respond positively to our invitations to review manuscripts, I would like to say: thank you.

I would also like to thank those of you who have agreed to serve on our editorial board this year. This distinguished group of scholars—who, I have warned them, will likely be asked to do more reviews, in less time, than others who review for us—will also advise us on editorial policies and provide critical feedback on a variety of issues relevant to the peer-review process at AJPS. We will have our first editorial board meeting in Boston during the APSA meeting in late August/early September, and we look forward to discussing a wide range of issues then. Be sure to watch for policy updates on the AJPS website that result from that meeting.

Until then, we will continue to provide the most efficient and relevant reviews of the papers that are submitted to AJPS. We do plan to close the AJPS portal to new submissions from August 4 through August 19. Since this isn’t the typical month-long hiatus, we needed to call it something else. We thought about calling this our “August recess,” following the time-honored tradition observed down the street from AU, the Congressional Recess. Some of us thought recess sounded like elementary school, and suggested instead that it be our “August break.” But since we’re only closing to submissions and otherwise making decisions, calling it a break made it sound like more of a vacay than is the case. Finally, we considered taking an “August holiday,” but, well, that just sounds silly. So, bottom line: we’re working, the first two weeks of August, just not accepting new submissions.

Hope your recess/break/vacay/holiday this summer is a good one!

Jan Leighley, Interim Editor

Two Weeks In: An Update from Lead Editor Jan Leighley

It’s hard to believe that it’s only been two weeks since we re-started AJPS editorial operations. Since then, we’ve had over 60 manuscripts submitted, and have those papers out for review. We’ve also worked to get reviews on papers that needed new reviewers, or had reviewers that needed to be pestered (apologies!) . . . which reminds me of the critical importance of reviewers in the peer-review process. We will recognize our reviewers by posting the entire list here at www.ajps.org at the end of the year. Until then, know that original invitations—or pesters about still needing reviews—reflect the editorial team’s reliance on experts across the discipline. Thank you for contributing your time and energy to make this work.

Speaking of time and energy: I’m pleased to announce that Layna Mosley (University of North Carolina) has agreed to join us as associate editor. Layna Mosley is Professor of Political Science at the University of North Carolina at Chapel Hill. Her research and teaching focus on international relations, international political economy, and comparative political economy. She is the author of two books, Labor Rights and Multinational Production and Global Capital and National Governments, and served as editor of Interview Research in Political Science. Her research, which has been supported by Fulbright, has appeared in numerous academic journals, including the Journal of Conflict Resolution, the American Journal of Political Science, Oxford Research Encyclopedia of Politics, New Political Economy, Human Rights Quarterly, American Political Science Review, and Comparative Political Studies. Mosley received her Ph.D. in Political Science from Duke University.

While we are still working on some transition issues, Layna’s joining the team means that all of us can shift more fully to the intellectual and scholarly work that makes the AJPS what it is, and worry less about the editorial office. That work, of course, reflects some of the best in the discipline, and we look forward to seeing more submissions over the next few months.

Jan Leighley, Interim Editor

Introducing the New Editorial Team

As MPSA President Elisabeth Gerber announced on May 3, the Council voted to appoint me as Interim Lead Editor for June 2018-June 2019, in anticipation of a successful search for the next four-year editorial team. Given my previous editorial service to the American Journal of Political Science (AJPS) and the Journal of Politics, the challenge of a one-month transition period seemed less daunting in light of the expectation of working with a team of associate editors. In identifying potential associate editors, my first priority was to find leading scholars across subfields of the discipline, ones whose professional values were consistent with the scholarly goals of the AJPS, and who were able to make a commitment to the journal, over other academic obligations, on such short notice.

As that one-month transition period comes to a close, I am pleased to announce that Sarah M. Brooks of The Ohio State University, Mary G. Dietz of Northwestern University, Jennifer L. Lawless of the University of Virginia, and Rocío Titiunik of the University of Michigan have agreed to serve as associate editors.

Sarah M. Brooks is Professor of Political Science and 2018-2019 Huber Faculty Fellow at The Ohio State University. Her research and teaching interests center on comparative and international political economy, Latin American politics, and social protection. Brooks is also co-director of the Brazil Working Group at the Center for Latin American Studies and co-director of the Globalization Workshop at the Mershon Center. She is the author of Social Protection and the Market in Latin America, and she has written extensively on the topic of social security and pension reform. Her research, which has been supported by the Gerda Henkel Foundation and Fulbright, has appeared in numerous scholarly journals, including International Organization, the American Journal of Political Science, the Journal of Politics, World Politics, Comparative Political Studies, and Latin American Politics and Society. Brooks received her Ph.D. in Political Science from Duke University.

Mary G. Dietz is the John Evans Professor of Political Theory and Professor of Political Science and Gender & Sexuality Studies at Northwestern University. Her areas of academic specialization are political theory and the interpretation of texts, with concentrations in feminist theory and politics; democratic theory and citizenship; the history of Western political thought; and contemporary political and social theory. Dietz is the author of Between the Human and the Divine: The Political Thought of Simone Weil and Turning Operations: Feminism, Arendt, and Politics, and editor of Thomas Hobbes & Political Theory. She also served as editor of Political Theory: An International Journal of Political Philosophy from 2005 to 2012. Dietz received her Ph.D. in Political Science from the University of California at Berkeley.

Jennifer L. Lawless is Professor of Politics at the University of Virginia. Her research focuses on political ambition, and she is the co-author of Women on the Run: Gender, Media, and Political Campaigns in a Polarized Era, co-author of Running from Office: Why Young Americans Are Turned Off to Politics, and author of Becoming a Candidate: Political Ambition and the Decision to Run for Office. She is also a nationally recognized expert on women and politics, and the co-author of It Still Takes a Candidate: Why Women Don’t Run for Office. Her research, which has been supported by the National Science Foundation, has appeared in numerous academic journals such as the American Journal of Political Science, American Political Science Review, and the Journal of Politics. Lawless received her Ph.D. in Political Science from Stanford University.

Rocío Titiunik is James Orin Murfin Professor of Political Science at the University of Michigan. She specializes in quantitative methodology for the social sciences, with emphasis on quasi-experimental methods for causal inference and political methodology. She is a member of the leadership team of the Empirical Implications of Theoretical Models (EITM) Summer Institute, member-at-large of the Society for Political Methodology, and member of Evidence in Governance and Politics (EGAP). She is also associate editor for Political Science Research and Methods. Her work appears in various journals in the social sciences and statistics, including the American Journal of Political Science, the American Political Science Review, the Journal of Politics, Econometrica, the Journal of the American Statistical Association, and the Journal of the Royal Statistical Society. Titiunik received her Ph.D. in Agricultural and Resource Economics from the University of California at Berkeley.

Soon I expect an additional editor to join the team; more details when I have them. Until then, the five of us will be working to secure high-quality reviews promptly and to identify those manuscripts that satisfy our expectations of intellectual contribution and scholarly impact. Final manuscript publication decisions will be made jointly by the associate editor to whom the manuscript is assigned and me, with no appeals accepted.

And so we’re off! The year will go by quickly, I’m sure—with thanks in advance to our reviewers (who do the real work) and to Rick Wilson and the rest of the Editorial Search Committee tasked with finding the next editorial team. I am also grateful to the many individuals on the MPSA Council, in the MPSA office, on the Wiley staff and on the MSU editorial staff for getting us going so quickly.

Hope that your summer is as fun and productive as I expect ours to be!

Jan Leighley, Interim Editor

QDR and the AJPS Replication Policy

(Guest Posting by Colin Elman and Diana Kapiszewski)

The Qualitative Data Repository (QDR), located at Syracuse University, ingests, curates, archives, manages, durably preserves, and publishes digital data used in qualitative and multi-method social inquiry.  The repository develops and publicizes common standards and methodologically informed practices for these activities, as well as for reusing and citing qualitative data. As part of this broader undertaking, QDR welcomes the opportunity to work with other organizations and institutions as they pursue their transparency goals. QDR is pleased to have been selected by The American Journal of Political Science (AJPS) to help instantiate part of its revised Replication and Verification Policy.

AJPS has a long-standing commitment to the general principles reflected in the Data Access and Research Transparency (DA-RT) initiative. AJPS considers openness to be a fundamental component of social science. Accordingly, AJPS signed the Journal Editors Transparency Statement (JETS) in October 2014, pledging to implement policies by January 2016 that require authors of evidence-based articles to make as accessible as possible the empirical foundation and logic of inference invoked in their research.

Earlier this year, the Journal clarified and enhanced its Guidelines for Preparing Replication Files. Among other important changes, the Guidelines now provide more comprehensive directions for how scholars of qualitative research and multi-method research with a qualitative component can fulfill openness requirements.  Just as the Journal’s policies with respect to quantitative approaches are instantiated in cooperation with the University of North Carolina’s Odum Institute for Research in Social Science, the Journal’s new qualitative policies will be facilitated by QDR.

AJPS’ editorial position is that it publishes rigorous social science produced using public procedures. Subject to the ethical and legal constraints described in the Guidelines, the Journal takes the view that both data and analysis should be accessible to readers. Moreover, while not mandated by JETS, the journal also undertakes a pre-publication appraisal of the analysis in each evidence-based article that has been accepted for publication.

AJPS recognizes that data access and research transparency should be pursued in ways that are consistent with the type of social inquiry being conducted, the forms of evidence being deployed, the ways in which the data were generated, and the analytical processes that were used.  That said, the Journal is confident that its guidelines will apply to most empirical researchers whose goal is rigorous social science. The ideas underpinning the journal’s commitment to openness comprise a central element of scientific practice, regardless of the subject matter of, specific investigative strategy used in, nature of the data invoked in, or the analytic procedures employed in a particular publication.

Pre-Publication Replication

AJPS’ review process addresses a broad set of questions about the potential contribution of any manuscript to the stock of knowledge on a given topic. The Journal’s replication requirement speaks to a narrower issue. It calls on scholars to make their data and analysis available so that AJPS editors (facilitated by Odum and QDR) can ascertain whether the particular combination of data and analysis produces the claimed result.

AJPS takes the view that, for many types of scholarship, a third party should be able to replicate precisely the steps an author took to analyze her data, and arrive at exactly the same result. Least controversially, repetitions of explicitly algorithmic (often machine-assisted) analysis of a bounded (and typically interval level) dataset should lead to duplicate results. The archetype of scholarship suited to this kind of assessment is the statistical analysis of quantitative data. Certain types of qualitative research, such as automated content analysis and qualitative comparative analysis, are also readily amenable to this type of evaluation.
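
To make this concrete, here is a minimal sketch of what such a push-button replication check can look like. This is an illustration only, not the Journal’s or the Odum Institute’s actual procedure; the file name, model specification, and published coefficient below are all hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

# Load the deposited replication dataset (hypothetical file name).
df = pd.read_csv("replication_data.csv")

# Re-run the model exactly as described in the paper (hypothetical specification).
model = smf.ols("support_for_disclosure ~ ioc_membership + gdp_per_capita", data=df).fit()

# Compare the re-estimated coefficient with the value reported in the published table.
published_coef = 0.42  # hypothetical published value
replicated_coef = model.params["ioc_membership"]
assert abs(replicated_coef - published_coef) < 1e-2, "Replication check failed"
print(f"Replicated coefficient: {replicated_coef:.3f} (published: {published_coef})")

If the deposited data and code reproduce the reported estimate, the check passes; any discrepancy would be flagged for the author to resolve.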

Replication is more challenging for qualitative research where the data analyzed do not form part of a bounded dataset with explicit codings, or where the mode of analysis is less obviously algorithmic.  Narrative case studies often combine these elements. When strict replication is infeasible, AJPS still requires authors to make their scholarship as understandable and evaluable as possible. Authors of qualitative research, like all AJPS authors, must explicitly state the logic of inference they are invoking, describe their research processes explicitly and precisely, and provide the materials necessary to elucidate how they arrived at their findings and conclusions.

Ethical and Legal Obligations and Transparency

According to AJPS’ Guidelines, authors may request a waiver of the transparency requirements where sharing data could put the safety, dignity, or well-being of human participants at risk. Moreover, AJPS readily acknowledges that the person best positioned to assess the risk involved in disclosure is the author. The information that authors provide forms the basis of the Journal Editor’s decision concerning whether to grant the waiver.

AJPS strongly encourages authors not to treat providing access to the data underpinning their research as an “all or nothing” choice. Indeed, many scholars already routinely engage in practices that address the tension between transparency and protecting their human participants. For example, when a scholar quotes an anonymous source, she is offering a de-identified version of the data precisely to address this tension. AJPS’ transparency requirements simply obligate scholars to make such choices explicit and to explain them. Moreover, the data management community is developing increasingly sophisticated mechanisms for providing a reduced or modified view of data while protecting human participants, and AJPS encourages authors to use them. AJPS also understands that, in some situations, no mechanism or strategy will effectively address human participants concerns, inhibiting the sharing of the data associated with those project participants.
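
As a simple illustration of what such a reduced or modified view might look like in practice, here is a hypothetical sketch of de-identifying interview records before sharing them; the file, column names, and salting scheme are all invented for this example.

import hashlib
import pandas as pd

# Load the raw interview log (hypothetical source file).
interviews = pd.read_csv("interview_log.csv")

# Replace direct identifiers with stable pseudonyms; a project-specific salt
# prevents re-identification by hashing guessed names.
SALT = "project-specific-secret"
interviews["respondent_id"] = interviews["name"].map(
    lambda n: hashlib.sha256((SALT + n).encode()).hexdigest()[:8]
)

# Drop columns that could identify participants, then save the shareable view.
shared = interviews.drop(columns=["name", "email", "village"])
shared.to_csv("interviews_deidentified.csv", index=False)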

All AJPS authors must respect proprietary restrictions and copyright. As with human participants, however, it may be possible to share some data under these types of constraints. For qualitative sources, for example, the “fair use” exemption outlined in the 1976 US Copyright Act suggests that some (small) portion of different types of copyrighted materials can, under certain circumstances, be shared for non-commercial use, or for the purposes of private study, teaching, or criticism/review.

Additional Observations

Recent discussions of qualitative data access and research transparency reflect some anxiety among scholars about what impact meeting these obligations may have on them and their work. We hope the information above, and the following observations, will help to address some of these concerns.

First, while there have been some disagreements about how openness is best achieved, the great majority of contributions to the conversation have accepted the general principle that openness facilitates the understanding and evaluation of published claims. AJPS’ policy is consistent with this widely shared consensus.

Second, all advocates of openness likewise recognize that it is an ideal that sometimes has to be modified in practice given competing imperatives. For example, AJPS recognizes that concerns about human participants require a good faith dialogue between authors and the journal. Authors identify data constraints when they submit their manuscript, and the editor communicates the Journal’s decision about how the journal will proceed with respect to those constraints prior to review. This exchange provides the author and the editor with a common understanding of how the data will be managed, and of the implications of any constraints they are under for replication and subsequent sharing, before the article is sent for review.

Third, only the data used to produce the results discussed in the publication need be provided in order to comply with transparency requirements. For example, a quantitative replication dataset need not include all the variables in the study dataset from which it was drawn, but rather just the variables included in the analysis. Authors of qualitative scholarship are likewise required to share only the data underpinning central or contested empirical claims in their article. Beyond these minimum requirements, all authors need to make pragmatic judgments about how much data are needed to illustrate the empirical basis of their inquiry and make it fully and fairly evaluable.
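
For the quantitative case, meeting that minimum requirement can be as simple as trimming the deposited file to the analysis variables. A brief sketch, again with invented file and column names:

import pandas as pd

# Load the full study dataset (hypothetical file name).
study = pd.read_csv("full_study_data.csv")

# Only the variables that actually enter the published models need to be shared.
analysis_vars = ["support_for_disclosure", "ioc_membership", "gdp_per_capita"]
replication = study[analysis_vars]

# Deposit the trimmed file (e.g., on the journal's Dataverse), not the full study data.
replication.to_csv("replication_data.csv", index=False)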

As we hope is clear from the changes being introduced, AJPS welcomes submissions from all research traditions engaged in rigorous social science. We hope that the revised AJPS policy will be regarded as an on-ramp, and not a roadblock, for qualitative research.

Colin Elman, Syracuse University
Co-Director, Qualitative Data Repository and Methods Coordination Project

Diana Kapiszewski, Georgetown University
Co-Director, Qualitative Data Repository

AJPS to Award COS Open Practice Badges

By William G. Jacoby

The American Journal of Political Science has demonstrated its commitment to data access and research transparency over the past year through its rigorous replication and verification policy. Starting immediately, the AJPS will provide more visible signals of its adherence to these principles by adopting two of the “Badges to Acknowledge Open Practices” from the Center for Open Science (COS). Specifically, we will use the “Open Data” and “Open Materials” badges illustrated below. According to the COS guidelines, “(t)he Open Data badge is earned for making publicly available the digitally-shareable data necessary to reproduce the reported results.” Similarly, the guidelines state that “(t)he Open Materials badge is earned by making publicly available the components of the research methodology needed to reproduce the reported procedure and analysis.” Thus, the badges are intended to be a salient indicator that the articles to which they are awarded conform to the principles and best practices of openness in scientific research.

COS Badges

Any manuscript that has been accepted for publication at the AJPS and has successfully completed the data replication and verification process will automatically meet the criteria for the Open Data and Open Materials badges. Therefore, upon release of the replication Dataset on the AJPS Dataverse, these two badges will be added to the metadata of the Dataverse Dataset. The badges appear near the bottom of the main page for the article’s Dataverse Dataset, along with the statement, “The associated article has been awarded Open Materials and Open Data badges. Learn more about Open Practice Badges from the Center for Open Science.” When the article itself is published, the badges will appear with the information near the beginning of the electronic version in the Wiley Online Library. And they will be included as part of the statement about replication materials on the first page of the article’s print version.

Of course, some articles published in the American Journal of Political Science will not receive the Badges. For example, many formal theory manuscripts and virtually all of the normative theory manuscripts that are submitted to the Journal do not contain any empirical analyses. Such work is exempt from the AJPS Replication Policy, so the Open Practice Badges are not relevant to these manuscripts. And there are certain situations in which a manuscript may be given an exemption from the usual replication requirements due to the use of restricted-access data. In such cases, authors still are asked to explain how interested researchers could gain access to the data and to provide all relevant software code and documentation for replicating their analyses. Manuscripts in this situation would not receive the Open Data Badge, but they would be awarded the Open Materials Badge. Even with allowances for exceptions, we anticipate that the vast majority of the articles published in the American Journal of Political Science will receive both Badges.

The AJPS will be the first journal in political science to award Open Practice Badges to articles. Currently, the Badges are used by five other journals: four in psychology and one in linguistics. The Badges already appear in the AJPS Dataverse Datasets for all qualified articles (i.e., those that have successfully completed the replication and verification process). And starting today (May 10, 2016) they will appear in all articles published online in the Early View queue within the Wiley Online Library. Of course, this carries over to the print versions of the articles. The Open Practice Badges serve a useful purpose by helping to emphasize the distinctive quality of the work that appears in the American Journal of Political Science.


The American Journal of Political Science (AJPS) is the flagship journal of the Midwest Political Science Association and is published by Wiley.