More than 100 scientific journals and 31 scientific organizations have signed new publication guidelines to promote openness and integrity in the scientific literature.

A committee of disciplinary leaders and experts, journal editors, and funding agency representatives convened last November in Charlottesville, Virginia, to develop the Transparency and Openness Promotion (TOP) guidelines, which were published recently in Science.

Most scientists assume that transparency, openness, and reproducibility are a routine part of daily practice, but increasing evidence points to the contrary. Individual researchers have little incentive to be more open and transparent, and academic reward systems don’t do enough to recognize such practices, the guidelines’ authors explained in the journal article.

“Unfortunately, there is no centralized means of aligning individual and communal incentives via universal scientific policies and procedures,” the article stated.

The committee’s antidote was to develop eight standards for open practices that seek to improve the current incentive system and encourage transparency and openness among researchers. Each standard has three levels of stringency, reflecting the fact that no single standard applies equally across all disciplines or journals.

The standards “are modular, facilitating adoption in whole or in part. However, they also complement each other, in that commitment to one standard may facilitate adoption of others. Moreover, the guidelines are sensitive to barriers to openness by articulating, for example, a process for exceptions to sharing because of ethical issues, intellectual property concerns, or availability of necessary resources,” the article stated.

Two of the standards, citation and replication, seek to acknowledge the work researchers do to promote open practices. Under the third and most stringent level of the citation standard, for example, an article would not be published until it provides appropriate citations for data and materials, following the publication’s author guidelines. Under level 2 of the replication standard, the journal would encourage the submission of replication studies and conduct peer review blind to their results.

Several other standards describe how openness plays into reproducing and evaluating research. Reproducibility “increases confidence in results and also allows scholars to learn more about what results do and do not mean. Design standards increase transparency about the research process and reduce vague or incomplete reporting of the methodology,” the article stated. Standards on research materials, data sharing, and analytic methods also fall into this category.

Finally, two standards address the value of preregistering studies and analysis plans to encourage discovery in scientific research. Preregistered analysis plans, for example, “certify the distinction between confirmatory and exploratory research, or what is also called hypothesis-testing versus hypothesis-generating research. Making transparent the distinction between confirmatory and exploratory methods can enhance reproducibility,” the article stated.