bims-skolko Biomed News
on Scholarly communication
Issue of 2023‒02‒05
twenty-two papers selected by
Thomas Krichel
Open Library Society


  1. Clin Infect Dis. 2023 Jan 31. pii: ciad055. [Epub ahead of print]
      The language we use in our scientific communications can either empower or stigmatize the people we study and care for. Clinical Infectious Diseases is committed to prioritizing the use of inclusive, non-stigmatizing language in published manuscripts. We hereby call upon submitting authors, reviewers, and editors to do the same.
    Keywords:  Person-first language; inclusive; non-stigmatizing; scientific communications
    DOI:  https://doi.org/10.1093/cid/ciad055
  2. Dev World Bioeth. 2023 Feb 03.
      Predatory journals offer the promise of prompt publication to those willing to pay the article submission or processing fee. However, these journals do not offer rigorous peer review. Studies have shown that a substantial share of corresponding authors in predatory journals come from South Asia, particularly India. This scoping review aims to assess what is known about the reasons why healthcare researchers working in South Asia publish in predatory journals. 66 reports (14 editorials, 20 letters, 5 research reports, 10 opinion articles, 14 reviews, 2 commentaries and 1 news report) were included in the data charting and analysis. The analysis of the reports identified three main reasons why South Asian healthcare researchers publish in predatory journals: pressure to publish, lack of research support, and pseudo benefits. The review shows that predatory publishing in South Asia is a complex phenomenon. Combating predatory publications requires a holistic strategy that goes beyond merely blacklisting these journals or listing criteria for journals that do meet academic standards.
    Keywords:  Open Access; South Asia; predatory journals; publication ethics
    DOI:  https://doi.org/10.1111/dewb.12388
  3. Nature. 2023 Feb;614(7947): 224-226
      
    Keywords:  Computer science; Machine learning; Publishing; Research management
    DOI:  https://doi.org/10.1038/d41586-023-00288-7
  4. Neurointervention. 2023 Feb 01.
      In Korea, many editors of medical journals are also publishers; therefore, they need not only to manage peer review but also to understand current trends and policies in journal publishing and editing. This article aims to highlight some of these policies with examples. First, the use of artificial intelligence tools in journal publishing has increased, including for manuscript editing and plagiarism detection. Second, preprint publications, which have not been peer-reviewed, are becoming more common. During the COVID-19 pandemic, medical journals have been more willing to accept preprints in order to respond to rapidly changing pandemic health issues, leading to a significant increase in their use. Third, open peer review with reviewer comments is becoming more widespread, including the mandatory publication of peer-reviewed manuscripts with comments. Fourth, model text recycling policies provide guidelines for researchers and editors on how to appropriately recycle text, for example, in the background section of the Introduction or in the Methods section. Fifth, journals should take into account the recently updated 4th version of the Principles of Transparency and Best Practice in Scholarly Publishing, released in 2022. This version includes more detailed guidelines on journal websites, peer review processes, advisory boards, and author fees. Finally, it is recommended that titles of human studies include country names to clarify the cultural context of the research. Each editor must decide whether to adopt these six policies for their journals. Editor-publishers of society journals are encouraged to familiarize themselves with these policies so that they can implement them in their journals as appropriate.
    Keywords:  Artificial intelligence; Culture; Peer review; Policy; Scholarly communication
    DOI:  https://doi.org/10.5469/neuroint.2022.00493
  5. Br J Soc Psychol. 2023 Jan 31.
      In recent years, there has been a focus in social psychology on efforts to improve the robustness, rigour, transparency and openness of psychological research. This has led to a plethora of new tools, practices and initiatives that each aim to combat questionable research practices and improve the credibility of social psychological scholarship. However, the majority of these efforts derive from quantitative, deductive, hypothesis-testing methodologies, and there has been a notable lack of in-depth exploration about what the tools, practices and values may mean for research that uses qualitative methodologies. Here, we introduce a Special Section of BJSP: Open Science, Qualitative Methods and Social Psychology: Possibilities and Tensions. The authors critically discuss a range of issues, including authorship, data sharing and broader research practices. Taken together, these papers urge the discipline to carefully consider the ontological, epistemological and methodological underpinnings of efforts to improve psychological science, and advocate for a critical appreciation of how mainstream open science discourse may (or may not) be compatible with the goals of qualitative research.
    Keywords:  authorship; contributorship; interaction analysis; metascience; open data; open science; pre-registration; qualitative methods; qualitative social psychology; reproducibility
    DOI:  https://doi.org/10.1111/bjso.12628
  6. Psychol Sci. 2023 Feb 02. 9567976221140828
      In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 articles provided at least some data and six provided analysis code, but only one article was rated as exactly reproducible, and three were rated as essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the disclosure method of awarding badges.
    Keywords:  data sharing; journal policy; open badges; open data; reproducibility
    DOI:  https://doi.org/10.1177/09567976221140828
  7. Nurse Educ. 2022 Nov 16.
      BACKGROUND: This article reports the findings from a pilot study of a peer review process used with a group of faculty champions who were writing items for a state-wide initiative to establish a Next Generation NCLEX item teaching test bank.
    METHODS: Champions were oriented to the peer review process in a face-to-face session and completed reviews using the Clinical Judgment Item Peer Review Form created for the project.
    RESULTS: Eighteen faculty from 13 different schools attended the session and completed 55 reviews of 40 cases and 35 stand-alone items. Champions took approximately an hour to complete each case study and related stand-alone item review and give actionable feedback.
    CONCLUSIONS: The peer review process benefits reviewers and authors learning to write Next Generation NCLEX questions. The process used in this project can be replicated by other faculty in their own programs.
    DOI:  https://doi.org/10.1097/NNE.0000000000001322
  8. Nature. 2023 Feb;614(7946): 34
      
    Keywords:  Ethics; Genetics; Peer review
    DOI:  https://doi.org/10.1038/d41586-023-00218-7
  9. Surgery. 2023 Feb;173(2): 269. pii: S0039-6060(22)01076-5. [Epub ahead of print]
      
    DOI:  https://doi.org/10.1016/j.surg.2022.12.017
  10. PNAS Nexus. 2022 Mar;1(1): pgac016
      Preregistration of studies is a recognized tool in clinical research for improving the quality and reporting of results. In preclinical research, preregistration could boost the translation of published results into clinical breakthroughs. When studies rely on animal testing or form the basis of clinical trials, maximizing the validity and reliability of research outcomes also becomes an ethical obligation. Nevertheless, the implementation of preregistration in animal research is still slow. However, research institutions, funders, and publishers are starting to value preregistration, thereby paving the way for its broader acceptance in the future. Three public registries, the OSF registry, preclinicaltrials.eu, and animalstudyregistry.org, already encourage the preregistration of research involving animals. Here, they jointly declare common standards to make preregistration a valuable tool for better science. Registries should meet the following criteria: public accessibility, transparency about their financial sources, tracking of changes, and guaranteed sustainability of the data. Furthermore, registration templates should cover a minimum set of mandatory information, and studies must be uniquely identifiable. Finally, preregistered studies should be linked to any published outcome. To ensure that preregistration becomes a powerful instrument, publishers, funders, and institutions should refer to registries that fulfill these minimum standards.
    Keywords:  3R; animal research; open science; preclinical research; preregistration; research methods
    DOI:  https://doi.org/10.1093/pnasnexus/pgac016
  11. J Food Sci. 2023 Feb;88(2): 578
      
    DOI:  https://doi.org/10.1111/1750-3841.16483