bims-skolko Biomed News
on Scholarly communication
Issue of 2019‒09‒29
thirteen papers selected by
Thomas Krichel
Open Library Society


  1. PLoS One. 2019 ;14(9): e0223116
      OBJECTIVE: To conduct a time-cost analysis of formatting in scientific publishing.
    DESIGN: International, cross-sectional study (one-time survey).
    SETTING: Internet-based self-report survey, live between September 2018 and January 2019.
    PARTICIPANTS: Anyone working in research, science, or academia who submitted at least one peer-reviewed manuscript for consideration for publication in 2017. Completed surveys were available for 372 participants from 41 countries (60% of respondents were from Canada).
    MAIN OUTCOME MEASURE: Time (hours) and cost (wage per hour x time) associated with formatting a research paper for publication in a peer-reviewed academic journal.
    RESULTS: The median annual income category was US$61,000-80,999, and the median number of publications formatted per year was four. Manuscripts required a median of two attempts before they were accepted for publication. The median formatting time was 14 hours per manuscript, or 52 hours per person, per year. This resulted in a median calculated cost of US$477 per manuscript or US$1,908 per person, per year.
    CONCLUSIONS: To our knowledge, this is the first study to analyze the cost of manuscript formatting in scientific publishing. Our results suggest that scientific formatting represents a loss of 52 hours, costing the equivalent of US$1,908 per researcher per year. These results identify the hidden and pernicious price associated with scientific publishing and provide evidence to advocate for the elimination of strict formatting guidelines, at least prior to acceptance.
    DOI:  https://doi.org/10.1371/journal.pone.0223116
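    Note: the abstract above defines cost simply as wage per hour multiplied by formatting time. The short Python sketch below is not part of the study; it only illustrates that arithmetic, using an assumed hourly wage derived from the reported median income category together with the reported median hours, so its output will not exactly reproduce the study's per-respondent medians.

      # Illustrative sketch only (not the authors' analysis code).
      # Assumptions: a US$70,000 annual income (midpoint of the reported
      # median category) and a 2,080-hour work year; the hours and the
      # manuscript count are the medians quoted in the abstract.
      ANNUAL_INCOME_USD = 70_000
      WORK_HOURS_PER_YEAR = 2_080
      HOURS_PER_MANUSCRIPT = 14
      MANUSCRIPTS_PER_YEAR = 4

      hourly_wage = ANNUAL_INCOME_USD / WORK_HOURS_PER_YEAR
      cost_per_manuscript = hourly_wage * HOURS_PER_MANUSCRIPT
      annual_cost = cost_per_manuscript * MANUSCRIPTS_PER_YEAR

      print(f"hourly wage:            ${hourly_wage:,.2f}")          # ~$33.65
      print(f"cost per manuscript:    ${cost_per_manuscript:,.2f}")  # ~$471.15
      print(f"annual formatting cost: ${annual_cost:,.2f}")          # ~$1,884.62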
  2. J Clin Epidemiol. 2019 Sep 18. pii: S0895-4356(19)30222-7. [Epub ahead of print]
      Data extraction from reports of experimental or observational studies is a crucial methodological step informing evidence syntheses such as systematic reviews (SRs) and overviews of SRs. When extracting data, authors of SRs and overviews of SRs can encounter reporting discrepancies among multiple sources, for example between a manuscript and a conference abstract, or between a manuscript and a clinical trial registry; these discrepancies are defined as pairs of statements that cannot both be true. However, such discrepancies can also be found within a single manuscript published in a scientific journal. Here we describe examples of internal reporting discrepancies that can be found in a single source, with the aim of raising awareness among authors of SRs and overviews of SRs of such potential methodological issues. Authors of SRs and overviews of SRs should check whether the same information is reported in multiple places within a study and compare that information. Independent data extraction by two reviewers increases the chance of finding discrepancies, if they exist. We provide advice on how to deal with different types of discordance and how to report such discordances when conducting SRs and overviews of SRs.
    Keywords:  data; data extraction; discrepancies; errors; reporting; systematic reviews
    DOI:  https://doi.org/10.1016/j.jclinepi.2019.09.003
  3. Teach Learn Med. 2019 Sep 23. 1-6
      Problem: Traditionally, journal editors expect individuals to complete peer reviews of submitted manuscripts on their own. Recently, a number of editors of health sciences journals have begun to support, and even espouse, the practice of group peer review (GPR). With GPR, multiple individuals work together to complete the review with permission from the journal editor. Motivated by the idea that GPR could provide a meaningful service learning experience for participants in an interprofessional educational scholarship course, we conducted three such reviews and subsequently reflected on our experience and the lessons we learned. We frame our reflections using guiding principles from the domains of peer review, professional development, and educational scholarship. Intervention: The course director arranged with the editors of three health sciences journals for manuscripts to review. Each GPR occurred during a separate weekly session of the course and was completed using a similar set of steps: (a) gaining familiarity with review criteria, (b) reading aloud and discussing the manuscript's abstract as a class, (c) reading and critiquing assigned sections as individuals and then in small groups, (d) building consensus and sharing notes, and (e) having the course director synthesize notes into a single review for submission to the journal. Context: The course on educational scholarship involved 15 faculty members from the University of Utah's School of Medicine, College of Nursing, College of Pharmacy, College of Health, and School of Dentistry. The course director led three GPR sessions mid-way through the yearlong course. Impact: Participants' reflections indicate that GPR (a) conformed to principles of effective peer review; (b) resulted in a meaningful service learning experience within a formal professional development program, deepening understanding of core concepts of educational scholarship; and (c) represented an authentic example of engaging in educational scholarship (i.e., designing and evaluating an intervention while drawing upon and contributing to a body of shared understanding within a community of practice). Lessons Learned: Our principles-based approach to completing GPR within a professional development course on educational scholarship can serve as a model for others to follow. A rigorous, meaningful group review can be completed in 1 hour using a combination of group and individual activities focused on matching review criteria to the submitted manuscript. As a result, we will continue to include GPR in future offerings of this interprofessional course on educational scholarship and to study ways to optimize its value as a service learning experience.
    Keywords:  Faculty development; educational scholarship; group peer review; peer review
    DOI:  https://doi.org/10.1080/10401334.2019.1657870
  4. BMJ Open. 2019 Sep 20. 9(9): e028732
      OBJECTIVE: The peer review of completed Patient-Centered Outcomes Research Institute (PCORI) funded research includes reviews from patient reviewers (patients, caregivers, and patient advocates). Very little is known about how best to support these reviewers in writing helpful comments from a patient-centred perspective. This study aimed to evaluate the effect of a new training in peer review for patient reviewers.
    DESIGN: Observational study.
    SETTING: Online.
    PARTICIPANTS: Adults registered in the PCORI Reviewer Database as a patient stakeholder.
    INTERVENTION: A new online training in peer review.
    MAIN OUTCOME MEASURES: Changes in reviewers' knowledge and skills; changes in self-efficacy and attitudes; satisfaction with the training; and perceived benefits and relevance of the training.
    RESULTS: Before-after training survey data were analysed for 37 (29.4% of 126) patient reviewers invited to participate in an online training as part of a quality improvement effort or as part of a PCORI peer review. The reviewers improved their answers to the knowledge questions after the training (p<0.001, median number of answers improved 4 (95% CI 3 to 5), large effect size (ES) Cohen's w=0.94), particularly on the questions targeting the specifics of PCORI peer review. Reviewers improved their skills in recognising helpful review comments, and those without a peer-review background improved proportionally more (p=0.008, median number of answers improved 2 (95% CI 1 to 3), medium ES w=0.60). The training modestly increased reviewers' confidence in completing a high-quality peer review (p=0.005, mean increase in 5-point Likert rating 0.51 (95% CI 0.17 to 0.86), small-to-medium ES Cliff's delta=0.32), and their excitement about providing a review increased slightly (p=0.019, mean increase in 5-point Likert rating 0.35 (95% CI 0.03 to 0.68), small ES delta=0.19). All reviewers were satisfied with the training and would recommend it to other reviewers.
    CONCLUSIONS: Training improved knowledge, skills and self-efficacy and slightly increased enthusiasm for completing a PCORI peer review.
    Keywords:  PCORI; Patient education; Patient peer review; Peer review; Training patients
    DOI:  https://doi.org/10.1136/bmjopen-2018-028732
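    Note: Cliff's delta, one of the effect sizes quoted in the abstract above, is simply the proportion of pairs in which a post-training rating exceeds a pre-training rating minus the proportion of pairs in which the reverse holds. The sketch below is a generic illustration with invented 5-point ratings; it is not the study's data or analysis code.

      # Generic Cliff's delta sketch; the ratings below are hypothetical.
      def cliffs_delta(xs, ys):
          """(#{x > y} - #{x < y}) / (len(xs) * len(ys)), ranging from -1 to 1."""
          greater = sum(1 for x in xs for y in ys if x > y)
          less = sum(1 for x in xs for y in ys if x < y)
          return (greater - less) / (len(xs) * len(ys))

      before = [3, 3, 4, 2, 4, 3, 5, 2]  # hypothetical pre-training confidence ratings
      after = [4, 4, 4, 3, 5, 4, 5, 3]   # hypothetical post-training ratings

      print(f"Cliff's delta: {cliffs_delta(after, before):.2f}")  # positive = ratings shifted upward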
  5. BMJ Open. 2019 Sep 24. 9(9): e031767
      INTRODUCTION: The adaptation of guidelines is an increasingly used methodology for the efficient development of contextualised recommendations. Nevertheless, there is no specific reporting guidance. The essential Reporting Items of Practice Guidelines in Healthcare (RIGHT) statement could be useful for reporting adapted guidelines, but it does not address all the important aspects of the adaptation process. The objective of our project is to develop an extension of the RIGHT statement for the reporting of adapted guidelines (RIGHT-Ad@pt Checklist).
    METHODS AND ANALYSIS: To develop the RIGHT-Ad@pt Checklist, we will use a multistep process that includes: (1) establishment of a Working Group; (2) generation of an initial checklist based on the RIGHT statement; (3) optimisation of the checklist (an initial assessment of adapted guidelines, semistructured interviews, a Delphi consensus survey, an external review by guideline developers and users and a final assessment of adapted guidelines); and (4) approval of the final checklist. At each step of the process, we will calculate absolute frequencies and proportions, use content analysis to summarise and draw conclusions, discuss the results, draft a report and refine the checklist.
    ETHICS AND DISSEMINATION: We have obtained a waiver of approval from the Clinical Research Ethics Committee at the Hospital de la Santa Creu i Sant Pau (Barcelona, Spain). We will disseminate the RIGHT-Ad@pt Checklist by publishing it in a peer-reviewed journal, presenting it to relevant stakeholders and translating it into different languages. We will continuously seek feedback from stakeholders, monitor new relevant evidence and, if necessary, update the checklist.
    Keywords:  Evidence-based medicine; guideline adaptation; guidelines as topic; practice guideline; quality; reporting standards
    DOI:  https://doi.org/10.1136/bmjopen-2019-031767
  6. Commun Biol. 2019 ;2: 352
      The theme of this year's Peer Review Week, Quality in Peer Review, reflects both the necessity of peer review and the growing uncertainty about its role in scholarly publishing. We support peer review that aims to improve manuscripts through critical evaluation before publication.
    DOI:  https://doi.org/10.1038/s42003-019-0603-3
  7. BMJ Open. 2019 Sep 26. 9(9): e031259
      OBJECTIVES: To improve the trustworthiness of evidence, studies should be prospectively registered and research reports should adhere to existing standards. We aimed to systematically assess the degree to which endocrinology and internal medicine journals endorse study registration and reporting standards for randomised controlled trials (RCTs), systematic reviews (SRs) and observational studies (ObS). Additionally, we evaluated characteristics that predict endorsement of reporting or registration mechanisms by these journals.
    DESIGN: Meta-epidemiological study.
    SETTING: Journals included in the 'Endocrinology and Metabolism' and 'General and Internal Medicine' 2017 Journal Citation Reports.
    PARTICIPANTS: Journals with an impact factor of ≥1.0 that focus on clinical medicine and publish RCTs, SRs and ObS were included.
    PRIMARY OUTCOMES: Requirement of adherence to reporting guidelines and of study registration, as determined from the journals' author instructions.
    RESULTS: Of the 170 eligible journals (82 endocrinology and 88 internal medicine), endorsement of reporting standards was highest for RCTs, endorsed by 35 (43%) endocrine journals and 55 (63%) internal medicine journals, followed by SRs, with 21 (26%) and 48 (55%), respectively, and lastly by ObS, with 41 (50%) of endocrine journals and 21 (24%) of internal medicine journals. In 78 (46%) journals, RCTs were required to be registered and published in adherence to the Consolidated Standards of Reporting Trials statement. Only 11 (6%) journals required registration of SRs. Internal medicine journals were more likely than endocrine journals to endorse reporting guidelines, except for the Strengthening the Reporting of Observational Studies in Epidemiology statement. Besides trial registration, no other journal characteristic proved to be an independent predictor of reporting-standard endorsement for RCTs.
    CONCLUSION: Our results highlight that requirements for study registration and endorsement of reporting guidelines are suboptimal in internal medicine and endocrine journals. This shortcoming may be compounded because endorsement does not imply enforcement, impairing the practice of evidence-based medicine.
    Keywords:  endocrinology; endorsement; internal medicine; registration; reporting guidelines
    DOI:  https://doi.org/10.1136/bmjopen-2019-031259
  8. J Clin Epidemiol. 2019 Sep 20. pii: S0895-4356(19)30272-0. [Epub ahead of print]
      OBJECTIVE: To estimate the proportion of secondary publications of randomized controlled trials (RCTs) that provide new results relative to the primary publication.
    STUDY DESIGN AND SETTING: We searched for RCTs published in 2014 in the five medical journals with the highest impact factors. Secondary publications for each primary publication were then identified by their registration number. The main outcome measure was the proportion of secondary publications providing results already reported in the primary publication and/or non-prespecified analyses and/or a meta-analysis pooling results of studies not identified by systematic review.
    RESULTS: A total of 144 primary publications were identified; 94 (65%) had at least one secondary publication within 30 months after the primary publication. Of the secondary publications, 20% reported only results already present in the primary publication, and 35% reported non-prespecified results or pooled analyses not based on a systematic review. Factors associated with having at least one secondary publication were a large number of randomized trial participants (odds ratio [95% confidence interval]: 3.2 [1.1-9.3] for trials with >1000 vs ≤500 participants), investigation of a biologic product (4.8 [1.4-16.3] vs a non-biologic product) and the field of cardiology vs other fields (7.6 [1.46-39.8]).
    CONCLUSION: Most drug RCTs with results published in high-impact-factor journals had secondary publications. More than half of these secondary publications provided results already reported in the primary publication or results of non-prespecified analyses.
    Keywords:  Medical writing; Multiplicity; Outcome; Randomized Controlled trial; Secondary findings; Secondary publications
    DOI:  https://doi.org/10.1016/j.jclinepi.2019.09.012
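    Note: the abstract above reports associations as odds ratios with 95% confidence intervals. The sketch below shows, with invented counts, how an odds ratio and a Wald-type confidence interval are computed from a 2x2 table; the abstract does not state that the authors used this exact calculation (a regression model is equally plausible), so treat it purely as an illustration of the reported format.

      # Generic odds-ratio sketch with invented counts (not the study's data).
      import math

      def odds_ratio_ci(a, b, c, d, z=1.96):
          """OR and Wald 95% CI from a 2x2 table:
          a/b = with/without the outcome in group 1, c/d = the same in group 2."""
          or_ = (a * d) / (b * c)
          se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
          lower = math.exp(math.log(or_) - z * se_log_or)
          upper = math.exp(math.log(or_) + z * se_log_or)
          return or_, lower, upper

      # Hypothetical counts: trials with >1000 participants vs trials with <=500
      # participants, split by whether they had at least one secondary publication.
      or_, lower, upper = odds_ratio_ci(a=30, b=10, c=24, d=26)
      print(f"OR {or_:.1f} (95% CI {lower:.1f} to {upper:.1f})")  # ~3.2 (1.3 to 8.0)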
  9. Proc Natl Acad Sci U S A. 2019 Sep 24. 116(39): 19231-19236
      Trust in science increases when scientists and the outlets certifying their work honor science's norms. Scientists often fail to signal to other scientists and, perhaps more importantly, the public that these norms are being upheld. They could do so as they generate, certify, and react to each other's findings: for example, by promoting the use and value of evidence, transparent reporting, self-correction, replication, a culture of critique, and controls for bias. A number of approaches for authors and journals would lead to more effective signals of trustworthiness at the article level. These include article badging, checklists, a more extensive withdrawal ontology, identity verification, better forward linking, and greater transparency.
    Keywords:  scientific integrity; signaling trustworthiness; transparency
    DOI:  https://doi.org/10.1073/pnas.1913039116
  10. F1000Res. 2018 ;7: 501
      In this article, we review the literature on the benefits, and possible downsides, of openness in engineering research. We examine the issue from multiple perspectives, including the reasons and motivations for introducing open practices into an engineering researcher's workflow and the challenges faced by scholars looking to do so. Further, we present our thoughts and reflections on the role that open engineering research can play in defining the purpose and activities of the university. We make specific recommendations on how the public university can recommit to, and push the boundaries of, its role as the creator and promoter of public knowledge. In doing so, the university will further demonstrate its vital role in the continued economic, social, and technological development of society. We also include thoughts on how this applies specifically to the field of engineering and how a culture of openness and sharing within the engineering community can help drive societal development.
    Keywords:  engineering; open access; open engineering; open science; research dissemination
    DOI:  https://doi.org/10.12688/f1000research.14593.1
  11. Nature. 2019 Sep;573(7775): 495
      
    Keywords:  Environmental sciences; Publishing
    DOI:  https://doi.org/10.1038/d41586-019-02856-2