bims-skolko Biomed News
on Scholarly communication
Issue of 2020‒11‒22
twenty-one papers selected by
Thomas Krichel
Open Library Society


  1. Elife. 2020 Nov 19. pii: e62529. [Epub ahead of print]
      Peer review practices differ substantially between journals and disciplines. This study presents the results of a survey of 322 editors of journals in ecology, economics, medicine, physics and psychology. We found that 49% of the journals surveyed checked all manuscripts for plagiarism, that 61% allowed authors to recommend both for and against specific reviewers, and that less than 2% used a form of open peer review. Most journals did not have an official policy on altering reports from reviewers, but 91% of editors identified at least one situation in which it was appropriate for an editor to alter a report. Editors were also asked for their views on five issues related to publication ethics. A majority expressed support for co-reviewing, reviewers requesting access to data, reviewers recommending citations to their work, editors publishing in their own journals, and replication studies. Our results provide a window into what is largely an opaque aspect of the scientific process. We hope the findings will inform the debate about the role and transparency of peer review in scholarly publishing.
    Keywords:  none
    DOI:  https://doi.org/10.7554/eLife.62529
  2. R Soc Open Sci. 2020 Oct;7(10): 201520
      Preprints increase accessibility and can speed scholarly communication if researchers view them as credible enough to read and use. Preprint services do not provide the heuristic cues of a journal's reputation, selection, and peer-review processes that, regardless of their flaws, are often used as a guide for deciding what to read. We conducted a survey of 3759 researchers across a wide range of disciplines to determine the importance of different cues for assessing the credibility of individual preprints and preprint services. We found that cues related to information about open science content and independent verification of author claims were rated as highly important for judging preprint credibility, and peer views and author information were rated as less important. As of early 2020, very few preprint services display any of the most important cues. By adding such cues, services may be able to help researchers better assess the credibility of preprints, enabling scholars to more confidently use preprints, thereby accelerating scientific communication and discovery.
    Keywords:  credibility; preprints; trust
    DOI:  https://doi.org/10.1098/rsos.201520
  3. Med Health Care Philos. 2020 Nov 20.
      Retractions of COVID-19 literature in both preprints and the peer-reviewed literature serve as a reminder that there are still challenging issues underlying the integrity of the biomedical literature. The risks to academia become larger when such retractions take place in high-ranking biomedical journals. In some cases, retractions result from unreliable or nonexistent data, an issue that could easily be avoided by having open data policies, but there have also been retractions due to oversight in peer review and editorial verification. As COVID-19 continues to affect academics and societies around the world, failures in peer review might also constitute a public health risk. The effectiveness by which COVID-19 literature is corrected, including through retractions, depends on the stringency of measures in place to detect errors and to correct erroneous literature. It also relies on the stringent implementation of open data policies.
    Keywords:  Academic quality; Correction; Public health risk; Retraction; Type I and II errors; Withdrawal
    DOI:  https://doi.org/10.1007/s11019-020-09990-z
  4. J Am Board Fam Med. 2020 Nov-Dec;33(6): 986-991
      PURPOSE: To assess the reliability of peer review of abstracts submitted to academic family medicine meetings in North America. METHODS: We analyzed reviewer ratings of abstracts submitted: 1) as oral presentations to the North American Primary Care Research Group (NAPCRG) meeting from 2016 to 2019, as well as 2019 poster session or workshop submissions; and 2) in 12 categories to the Society of Teachers of Family Medicine (STFM) Spring 2018 meeting. In each category and year, we used a multi-level mixed model to estimate the abstract-level intraclass correlation coefficient (ICC) and the reliability of initial review (using the abstract-level ICC and the number of reviewers per abstract).
    RESULTS: We analyzed review data for 1554 NAPCRG oral presentation abstracts, 418 NAPCRG poster or workshop abstracts, and 1145 STFM abstracts. Across all years, abstract-level ICCs for NAPCRG oral presentations were below 0.20 (range, 0.10 in 2019 to 0.18 in 2016) and were even lower for posters and workshops (range, 0.00-0.10). After accounting for the number of reviewers per abstract, reliabilities of initial review for NAPCRG oral presentations ranged from 0.24 in 2019 to 0.30 in 2016 and 0.00 to 0.18 for posters and workshops in 2019. Across 12 STFM submission categories, the median abstract-level ICC was 0.21 (range, 0.12-0.50) and the median reliability was 0.42 (range, 0.25-0.78).
    CONCLUSIONS: For abstracts submitted to North American academic family medicine meetings, inter-reviewer agreement is often low, compromising initial review reliability. For many submission categories, program committees should supplement initial review with independent postreview assessments.
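    The reliability figures above can be projected from a single-rating ICC with the Spearman-Brown formula; the sketch below assumes that is the calculation behind the reported numbers (the reviewer count of two is a hypothetical, not taken from the study):

```python
def spearman_brown(icc, k):
    """Projected reliability of the mean of k reviewers' ratings,
    given the single-rating intraclass correlation (ICC)."""
    return k * icc / (1 + (k - 1) * icc)

# With the 2016 oral-presentation ICC of 0.18 and a hypothetical
# two reviewers per abstract, the projected reliability is about 0.31.
reliability_2016 = spearman_brown(0.18, 2)
```

    Adding reviewers raises the projected reliability of the pooled rating, which is why the reliabilities reported above exceed the raw abstract-level ICCs.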
    Keywords:  Abstracting and Indexing; Biostatistics; Faculty; Observer Variation; Peer Review; Primary Health Care
    DOI:  https://doi.org/10.3122/jabfm.2020.06.200123
  5. Curr Med Res Opin. 2020 Nov 18.
      Aim: Non-peer-reviewed manuscripts posted as preprints can be cited in peer-reviewed articles, which has both merits and demerits. International Committee of Medical Journal Editors guidelines mandate authors to declare preprints at the time of manuscript submission. We evaluated the trends in pharma-authored research published as preprints and their scientific and social media impact by analyzing citation rates and altmetrics. Research design and methods: We searched EuroPMC, PrePubMed, bioRxiv and MedRxiv for preprints submitted by authors affiliated with the top 50 pharmaceutical companies from inception until June 15, 2020. Data were extracted and analyzed from the search results. The number of citations for the preprint and peer-reviewed versions (if available) was compiled using the Publish or Perish software (version 1.7). Altmetric score was calculated using the "Altmetric it" online tool. Statistical significance was analyzed by Wilcoxon rank-sum test. Results: A total of 498 preprints were identified across bioRxiv (83%), PeerJ (5%), F1000Research (6%), Nature Proceedings (3%), Preprint.org (3%), Wellcome Open Research preprint (0.2%) and MedRxiv (0.2%) servers. Roche, Sanofi and Novartis contributed 56% of the retrieved preprints. The median number of citations for the included preprints was 0 (IQR =1, Min-Max =0-45). The median number of citations for the published preprints and unpublished preprints was 0 for both (IQR =1, Min-Max =0-25 and IQR =1, Min-Max =0-45, respectively; P = .091). The median Altmetric score of the preprints was 4 (IQR =10.5, Min-Max =0-160). Conclusion: Pharma-authored research is being increasingly published as preprints and is also being cited in other peer-reviewed publications and discussed in social media.
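    The Wilcoxon rank-sum comparison of citation counts used above can be sketched in pure Python with the large-sample normal approximation; the citation counts below are synthetic illustrations, not the study's data:

```python
import math

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test via the normal approximation.
    Assumes no tied values; returns (z, p)."""
    combined = sorted(x + y)
    ranks = {v: i + 1 for i, v in enumerate(combined)}
    w = sum(ranks[v] for v in x)            # rank sum of the first sample
    n1, n2 = len(x), len(y)
    mean = n1 * (n1 + n2 + 1) / 2           # expected rank sum under H0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical citation counts: published vs. unpublished preprints.
z, p = rank_sum_test([0, 1, 3], [2, 5, 8])
```

    A real analysis would use an implementation that handles ties (common with many zero-citation preprints), such as the exact or tie-corrected variants of the test.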
    Keywords:  Guidelines; Medical communications; Metrics
    DOI:  https://doi.org/10.1080/03007995.2020.1853083
  6. Account Res. 2020 Nov 18.
      The current system for assessing and publicly notifying concerns about publication integrity is slow, inefficient, inconsistent, inadequate and opaque. Readers are therefore left unaware of potential issues about publications or are given inadequate information to assess publication integrity. We propose a new process for dealing with publication integrity involving the establishment of independent panel(s) that assess publication integrity and transparently report the outcomes of those assessments, independent from assessment of any misconduct.
    Keywords:  Expression of Concern; Publication integrity; Retraction
    DOI:  https://doi.org/10.1080/08989621.2020.1852938
  7. R Soc Open Sci. 2020 Oct;7(10): 200834
      Science is self-correcting, or so the adage goes, but to what extent is that indeed the case? Answering this question requires careful consideration of the various approaches to achieve the collective goal of self-correction. One of the most straightforward mechanisms is individual self-correction: researchers rectifying their own mistakes by publishing a correction notice. Although it offers an efficient route to correcting the scientific record, it has received little to no attention from a metascientific point of view. We aim to fill this void by analysing the content of correction notices published from 2010 until 2018 in the three psychology journals featuring the highest number of corrections over that timespan based on the Scopus database (i.e. Psychological Science with N = 58, Frontiers in Psychology with N = 99 and Journal of Affective Disorders with N = 57). More concretely, we examined which aspects of the original papers were affected (e.g. hypotheses, data analyses, and metadata such as author order, affiliations and funding information) as well as the perceived implications for the papers' main findings. Our exploratory analyses showed that many corrections involved inconsequential errors. Furthermore, authors rarely revised their conclusions, even though several corrections concerned changes to the results. We conclude with a discussion of current policies, and suggest ways to improve upon the present situation by (i) preventing mistakes, and (ii) transparently rectifying those mistakes that do find their way into the literature.
    Keywords:  metascience; publication practices; self-correction
    DOI:  https://doi.org/10.1098/rsos.200834
  8. Nature. 2020 Nov 20.
      
    Keywords:  Careers; Conferences and meetings; Events
    DOI:  https://doi.org/10.1038/d41586-020-03300-6
  9. Health Psychol Rev. 2020 Nov 19. 1-17
      The article describes a position statement and recommendations for actions that need to be taken to develop best practices for promoting scientific integrity through open science in health psychology endorsed at a Synergy Expert Group Meeting. Sixteen Synergy Meeting participants developed a set of recommendations for researchers, gatekeepers, and research end-users. The group process followed a nominal group technique and voting system to elicit and decide on the most relevant and topical issues. Seventeen priority areas were listed and voted on, 15 of them were recommended by the group. Specifically, the following priority actions for health psychology were endorsed: (1) for researchers: advancing when and how to make data open and accessible at various research stages and understanding researchers' beliefs and attitudes regarding open data; (2) for educators: integrating open science in research curricula, e.g., through online open science training modules, promoting preregistration, transparent reporting, open data and applying open science as a learning tool; (3) for journal editors: providing an open science statement, and open data policies, including a minimal requirements submission checklist. Health psychology societies and journal editors should collaborate in order to develop a coordinated plan for research integrity and open science promotion across behavioural disciplines.
    Keywords:  Open science; health psychology; integrity; open access; replication
    DOI:  https://doi.org/10.1080/17437199.2020.1844037
  10. Proc Math Phys Eng Sci. 2020 Oct;476(2242): 20200746
      At Proceedings of the Royal Society A, something we are always concerned and vigilant about is publication malpractice. This editorial examines the background to some small changes to our reviewer forms that will help us in identifying patterns of worrying behaviour. The importance of this in the context of the relationship of science to policy-making and the public perception of science is stressed.
    Keywords:  bibliometrics; citations; ethics
    DOI:  https://doi.org/10.1098/rspa.2020.0746
  11. Int J Surg. 2020 Nov 12. pii: S1743-9191(20)30779-2. [Epub ahead of print]
      INTRODUCTION: The PROCESS Guidelines were first published in 2016 and were last updated in 2018. They provide a structure for reporting surgical case series in order to increase reporting robustness and transparency, and are used and endorsed by authors, journal editors and reviewers alike. In order to drive forwards reporting quality, they must be kept up to date. As such, we have updated these guidelines via a DELPHI consensus exercise. METHODS: The updated guidelines were produced via a DELPHI consensus exercise. Members from the previous DELPHI group were again invited, alongside editorial board members and peer reviewers of the International Journal of Surgery and the International Journal of Surgery Case Reports. An online survey was completed by this expert group to indicate their agreement with proposed changes to the checklist items.
    RESULTS: A total of 53 surgical experts agreed to participate and 49 (92%) completed the survey. The responses and suggested modifications were incorporated into the previous 2018 guidelines. There was a high degree of agreement amongst the PROCESS Group, with all but one of the PROCESS items receiving scores of 7-9 from over 70% of participants.
    CONCLUSION: A DELPHI consensus exercise was completed, and an updated and improved PROCESS Checklist is now presented.
    Keywords:  PROCESS; Surgery; case series; guideline
    DOI:  https://doi.org/10.1016/j.ijsu.2020.11.005
  12. Wilderness Environ Med. 2020 Nov 12. pii: S1080-6032(20)30171-X. [Epub ahead of print]
      
    DOI:  https://doi.org/10.1016/j.wem.2020.09.005
  13. Nature. 2019 Nov 20.
      
    Keywords:  Publishing; Research management
    DOI:  https://doi.org/10.1038/d41586-019-03558-5
  14. Eur Heart J Case Rep. 2020 Oct;4(5): 1-5
      Background: Case reports are subject to significant variation in their content, and the absence of pertinent case details can limit their benefit to the medical community. To address this, a reporting standard (CARE) has been developed. Case reports published in European Heart Journal - Case Reports (EHJ-CR) are subject to specific checks by editors to confirm compliance with the CARE reporting standard. However, the degree to which case reports published by EHJ-CR comply with the CARE reporting standard has not been established. Methods: Case reports published in EHJ-CR during 2018 were reviewed for compliance with the CARE reporting standard. Two authors assessed each article for compliance with each of the 31 criteria.
    Results: In 2018, 130 case reports/series were published by EHJ-CR. The median number of CARE criteria achieved by each article was 21 (interquartile range 21-25) out of 31. CARE criteria with the highest adherence were timeline inclusion, a clear and well-referenced discussion, and declaration of competing interests, all present in 100% of articles. In contrast, some aspects were poorly adhered to, including the patient perspective and details of funding sources. There was no difference in overall compliance with aspects of the CARE standard between diagnostic and interventional case reports. However, lower compliance was seen for the discussion of diagnostic challenges in interventional studies (19%) when compared to diagnostic studies (44%). The continent of authorship and month of submission did not affect CARE adherence.
    Conclusions: There was good compliance with the CARE reporting standards by case reports published in EHJ-CR. A number of specific areas for improvement have been identified which will be considered by the editorial board of EHJ-CR.
    Keywords:  Audit; CARE; Case reports; Reporting standards
    DOI:  https://doi.org/10.1093/ehjcr/ytaa251
  15. Mediterr J Rheumatol. 2020 Sep;31(Suppl 2): 243-246
      The flow of information on Coronavirus Disease 2019 (COVID-19) is intensifying, requiring concerted efforts of all scholars. Peer-reviewed journals as established channels of scientific communications are struggling to keep up with unprecedented high submission rates. Preprint servers are becoming increasingly popular among researchers and authors who set priority over their ideas and research data by pre-publication archiving of their manuscripts on these professional platforms. Most published articles on COVID-19 are now archived by the PubMed Central repository and available for searches on LitCovid, which is a newly designed hub for specialist searches on the subject. Social media platforms are also gaining momentum as channels for rapid dissemination of COVID-19 information. Monitoring, evaluating and filtering information flow through the established and emerging scholarly platforms may improve the situation with the pandemic and save lives.
    Keywords:  COVID-19; hydroxychloroquine; information; periodicals as topic; retractions; social media
    DOI:  https://doi.org/10.31138/mjr.31.3.243
  16. Patterns (N Y). 2020 Apr 10. 1(1): 100007
      The Scholexplorer API, based on the Scholix (Scholarly Link eXchange) framework, aims to identify links between articles and supporting data. This quantitative case study demonstrates that the API vastly expanded the number of datasets previously known to be affiliated with University of Bath outputs, allowing improved monitoring of compliance with funder mandates by identifying peer-reviewed articles linked to at least one unique dataset. Availability of author names for research outputs increased from 2.4% to 89.2%, which enabled identification of ten articles reusing non-Bath-affiliated datasets published in external repositories in the first phase, giving valuable evidence of data reuse and impact for data producers. Of these, only three were formally cited in the references. Further enhancement of the Scholix schema and enrichment of Scholexplorer metadata using controlled vocabularies would be beneficial. The adoption of standardized data citations by journals will be critical to creating links in a more systematic manner.
    Keywords:  Scholexplorer; Scholix; data publication; data repository; data reuse; data sharing; research data; research data management; research impact; scholarly communication
    DOI:  https://doi.org/10.1016/j.patter.2020.100007
  17. J Obstet Gynaecol Can. 2020 Nov 12. pii: S1701-2163(20)30904-X. [Epub ahead of print]
      
    DOI:  https://doi.org/10.1016/j.jogc.2020.11.006