bims-skolko Biomed News
on Scholarly communication
Issue of 2024-11-10
seventeen papers selected by
Thomas Krichel, Open Library Society



  1. Nature. 2024 Nov 05.
      
    Keywords:  Authorship; Publishing; Research data; Research management
    DOI:  https://doi.org/10.1038/d41586-024-03494-z
  2. Arch Soc Esp Oftalmol (Engl Ed). 2024 Oct 31. pii: S2173-5794(24)00175-0. [Epub ahead of print]
      
    DOI:  https://doi.org/10.1016/j.oftale.2024.10.009
  3. Nature. 2024 Nov 05.
      
    Keywords:  Machine learning; Publishing; Research data
    DOI:  https://doi.org/10.1038/d41586-024-03542-8
  4. Account Res. 2024 Nov 04. 1-9
       BACKGROUND: In this paper, we explore the question "Why can't AI be a coauthor?" and reveal a rarely discussed reason.
    METHODS AND RESULTS: First, allowing AI to be a coauthor disregards the uniquely human experience of writing texts. Human authors come to be seen as mere producers of texts rather than rational beings engaged in a value-added and humanized learning process expressed through the paper. The relationship between the human author and the thesis is reduced to a mere result of generation rather than a result of individual human critical thinking. Second, allowing AI to be a coauthor leads to self-delusion about one's own rationality and thus violates the responsibility to understand the world correctly. In this process of self-deception, those who grant AI coauthor status do realize that AI is not the same as a human; nevertheless, they self-deceivingly assume that AI has the same internal states as humans. The relationship between the author and the work is then no longer seen as a position to be respected, but as something probabilistic and gamified.
    CONCLUSIONS: Finally, we discuss the potential consequences of these rationales, concluding that including AI as a coauthor implies a disregard for humanization.
    Keywords:  AI coauthorship; authorship ethics; humanization; self-deception
    DOI:  https://doi.org/10.1080/08989621.2024.2420812
  5. Nature. 2024 Nov;635(8037): 10
      
    Keywords:  Computer science; Conferences and meetings; Machine learning; Publishing
    DOI:  https://doi.org/10.1038/d41586-024-03588-8
  6. Pharmacoepidemiol Drug Saf. 2024 Nov;33(11): e70045
      
    Keywords:  peer review; real-world data; replicability; reproducibility; transparency
    DOI:  https://doi.org/10.1002/pds.70045
  7. Sci Rep. 2024 Nov 04. 14(1): 26626
      Transparency within biomedical research is essential for research integrity, credibility, and reproducibility. To increase adherence to optimal scientific practices and enhance transparency, we propose the creation of a journal transparency tool (JTT) that will allow users to obtain information about a given scholarly journal's operations and transparency policies. This study is part of a program of research to obtain user preferences to inform the proposed JTT. Here, we report on our consultation with clinicians and researchers. This mixed-methods study was conducted in two parts. The first part involved a cross-sectional survey of a random sample of authors from biomedical journals. The survey asked clinicians and researchers about the inclusion of a series of potential scholarly metrics and user features in the proposed JTT. Quantitative survey items were summarized with descriptive statistics, and thematic content analysis was employed to analyze text-based responses. Subsequent focus groups used the survey responses to further explore the inclusion of items in the JTT. Items with less than 70% agreement were used to structure discussion points during these sessions (the threshold rule is sketched in code after this entry). After each discussion, participants voted on the user features and metrics to be considered within the tool. Thematic content analysis was conducted on interview transcripts to identify the core themes discussed. A total of 632 participants (5.5% response rate) took part in the survey. In total, 74.7% of respondents found it 'occasionally', 'often', or 'almost always' difficult to determine whether health information online is based on reliable research evidence. Twenty-two participants took part in the focus groups, where three user features and five journal tool metrics were major discussion points. Thematic analysis of the interview transcripts yielded six themes. The use of registration was the only item that failed to meet the 70% threshold after both the survey and the focus groups. Participants demonstrated low scholarly communication literacy when discussing tool metric suggestions. Our findings suggest that the JTT would be valuable for both researchers and clinicians. The outcomes of this research will contribute to developing and refining the tool in accordance with the preferences of researchers and clinicians.
    Keywords:  Clinician; Health literacy; Journal metrics; Journal transparency tool; Researcher; Transparency
    DOI:  https://doi.org/10.1038/s41598-024-77790-z
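    A minimal sketch of the 70% agreement rule described in the abstract above: items falling below the threshold after the survey are carried forward as focus-group discussion points. The item names and endorsement counts below are invented for illustration; only the threshold and the survey size (632 respondents) come from the abstract.

        # Hypothetical illustration of the consensus rule in entry 7:
        # survey items with less than 70% agreement become focus-group
        # discussion points. Item names and endorsement counts are
        # invented; the threshold and sample size are from the abstract.
        AGREEMENT_THRESHOLD = 0.70
        TOTAL_RESPONDENTS = 632

        survey_items = {
            "open access policy": 540,    # respondents endorsing inclusion
            "peer review model": 498,
            "use of registration": 401,
        }

        for item, endorsements in survey_items.items():
            share = endorsements / TOTAL_RESPONDENTS
            verdict = ("retain" if share >= AGREEMENT_THRESHOLD
                       else "discuss in focus groups")
            print(f"{item}: {share:.1%} agreement -> {verdict}")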
  8. PLoS Biol. 2024 Nov;22(11): e3002870
      We conducted an international cross-sectional survey of biomedical researchers' perspectives on the reproducibility of research. This study builds on a widely cited 2016 survey on reproducibility and provides a biomedical-specific and contemporary perspective. To sample the community, we randomly selected 400 journals indexed in MEDLINE, from which we extracted the author names and emails from all articles published between October 1, 2020 and October 1, 2021. We invited participants to complete an anonymous online survey which collected basic demographic information, perceptions about a reproducibility crisis, perceived causes of irreproducibility of research results, experience conducting reproducibility studies, and knowledge of funding and training for research on reproducibility. A total of 1,924 participants accessed our survey, of which 1,630 provided usable responses (response rate 7% of 23,234). Key findings include that 72% of participants agreed there was a reproducibility crisis in biomedicine, with 27% indicating the crisis was "significant." The leading perceived cause of irreproducibility was "pressure to publish", with 62% of participants indicating it "always" or "very often" contributes. About half of the participants (54%) had run a replication of their own previously published study, while slightly more (57%) had run a replication of another researcher's study. Just 16% of participants indicated their institution had established procedures to enhance the reproducibility of biomedical research, and 67% felt their institution valued new research over replication studies. Participants also reported few opportunities to obtain funding to attempt to reproduce a study, and 83% perceived it would be harder to do so than to get funding for a novel study. Our results may be used to guide training and interventions to improve research reproducibility and to monitor rates of reproducibility over time. The findings are also relevant to policy makers and academic leadership looking to create incentives and research cultures that support reproducibility and value research quality.
    DOI:  https://doi.org/10.1371/journal.pbio.3002870
  9. Medicine (Baltimore). 2024 Nov 01. 103(44): e40259
      Open science practices aim to increase the transparency and availability of research through open data, open access platforms, and public access. Given the increasing popularity of complementary, alternative, and integrative medicine (CAIM) research, our study aims to explore current open science practices and perceived barriers among CAIM researchers with respect to their own research articles. We conducted an international cross-sectional online survey sent to authors who published articles in MEDLINE-indexed journals categorized under the broad subject of "Complementary Therapies" or articles indexed under the MeSH term "Complementary Therapies." Articles were extracted to obtain the names and emails of all corresponding authors, and 8,786 researchers were emailed our survey, which included questions about participants' familiarity with open science practices, their own open science practices, and perceived barriers to open science in CAIM with respect to their most recently published article. Basic descriptive statistics were generated from the quantitative data. The survey was completed by 292 participants (3.32% response rate). Results indicate that the majority of participants were "very familiar" (n = 83, 31.68%) or "moderately familiar" (n = 83, 31.68%) with the concept of open science practices while creating their study. Open access publishing was the most familiar practice, with 51.96% (n = 136) of survey respondents publishing with open access. Although participants were familiar with the concept of open science practices, their actual implementation of these practices was low. Common barriers included not knowing where to share study materials, where to share data, or how to make a preprint, as well as a lack of overall knowledge about open science and a lack of funding or institutional support. Future efforts should explore how to improve open science training for CAIM researchers.
    DOI:  https://doi.org/10.1097/MD.0000000000040259
  10. Digit Health. 2024 Jan-Dec;10: 20552076241290955
       Objective: To examine how African health researchers share data. The review summarized the types of data collected, the data-sharing platforms used, and how the geographical distribution of African-based health researchers influenced data-sharing practices. Ethical, legal, and social aspects were considered, and institutional and governmental matters such as research support and funding were identified.
    Methods: The PubMed, Web of Science, LILACS, African Journal Archive, and Scopus databases were searched. Full-text screening was conducted, and data were extracted using the data extraction tool from an a priori published Joanna Briggs Institute protocol. Discrepancies were resolved by consensus. Data were presented using a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram, figures, tables, and narrative text.
    Results: Of the 3869 studies identified, 32 were included in the final review. The number of published studies spiked from 2015 to 2019 (n = 24, 75.0%) and then declined from 2020 to April 2023 (n = 6, 18.8%). Ten of the included studies were from South Africa, five from Kenya, three each from Nigeria and Tanzania, two each from Ghana and Sierra Leone, and one each from Malawi, Ethiopia, Cameroon, Mali, Gambia, Senegal, and Burkina Faso. Negative factors impacting the data sharing practices of health researchers in Africa included barriers to individual research capacity, governmental bureaucracy and corruption, legal obstacles, technological problems, prohibitive publication costs, lack of funding, institutional delays, and ethical issues.
    Conclusion: This review identified how African health researchers undertook data sharing in their countries. It pinpointed how geographical location, and the resulting individual and institutional challenges to data distribution, influenced health researchers' ability to share data and publish their research. Many parts of Africa are still not participating in research because of the many factors that negatively impact health data sharing on the continent.
    Keywords:  Africa; Data sharing; health; information dissemination; open science
    DOI:  https://doi.org/10.1177/20552076241290955
  11. Micron. 2024 Oct 19. pii: S0968-4328(24)00144-6. [Epub ahead of print] 188: 103727
      It has been ten years since the Editors of Micron contributed an editorial offering tips and tricks to help potential contributors navigate the editorial process and improve their chances of manuscript acceptance. In this contribution, we report how the updated guidelines have positively impacted the journal's content and performance over time, and provide new recommendations and insights to ensure your submission receives the attention it deserves from our editorial team.
    Keywords:  Article transfer service; G08 microscopy journals; Graphical abstract; Journal performance; Manuscript submission; Microscopy science; Multidisciplinary journal; Publishing; Readership; Reviewers; Special issues
    DOI:  https://doi.org/10.1016/j.micron.2024.103727