The Data Access and Research Transparency (DA-RT) initiative is driving a wedge into the community of scholars. I must admit that I cannot speak with great authority on this subject, having only recently finished my PhD. Nor am I an especially method-driven scholar, even though I read methodological works with genuine interest. But since I recently signed a petition requesting a delay and further discussion of DA-RT, I feel compelled to record my stance on the matter in greater detail.
Essentially, DA-RT requires that data cited in a publication be available at the time of publication through a trusted digital repository. The American Journal of Political Science (AJPS), for example, states in its guidelines on preparing replication files (p. 24) that authors of all accepted manuscripts must provide replication files through the Harvard Dataverse Network before the article enters the production stage. DA-RT has been lucidly analyzed by scholars far more established than I (see, for example, Jeffrey Isaac’s editorial in Perspectives on Politics and his blog entry, as well as Rick Wilson’s blog entry on the subject). Nevertheless, in what follows I would like to share some of DA-RT’s more practical deficiencies that I have not yet seen addressed, particularly concerning questions of public access to data.
DA-RT seems to have been drafted primarily with quantitative research in mind, as the AJPS arrangement described above underlines. But its standards are extending to qualitative research as well (see, for example, this newsletter of the APSA section for qualitative and multi-method research). Scholars relying on interviews or field notes, for example, may therefore be equally required to upload digital transcripts to an online repository. Again, I have three concerns. First, this would require researchers to prepare full transcripts in the first place, which imposes a significant additional workload and takes time away from areas where it might be spent more wisely. Not all of us will be able to hire research assistants to clear this hurdle effortlessly. Second, if I conduct a series of one-hour interviews on Free Trade Agreements, of which I spend ten minutes discussing negotiating directives, and eventually publish an article on that particular aspect, will I be required to provide full transcripts or only the ten minutes of each interview that served as the data for the piece? Third, the question of liability when using digital repositories does not go away. Whether all of this will affect interviewees’ willingness to talk to scholars remains to be seen, but the concern cannot be dismissed out of hand.
None of the above should be read as opposition to data access and replicability in principle. What makes this debate so delicate is that anyone voicing concerns about DA-RT is suspected of trying to cover up bad research practices. Formally delaying DA-RT may be unwarranted; editors are experienced scholars who will use their discretion to tailor these standards to the needs of individual pieces of research. But let me end with two more general observations. First, no standard will prevent individual scholars from cheating if they really want to, and DA-RT may not even raise the bar significantly for those who do. Second, DA-RT will require only small adjustments from scholars publishing in the journals subscribing to it; much of what it includes is already best practice in top-quartile political science journals. In this sense, the debate that DA-RT stirs may be a boon because it focuses attention on the general principles underpinning “good” scholarly practice. But DA-RT is neither a panacea rooting out “bad” research, nor should we let a few rotten apples spoil our most valuable currency: the trust that all of us conduct our research in good faith. If disagreement with DA-RT is interpreted primarily as an attempt to cover up flawed research, the process may turn out to be a bane for the community of scholars after all.