bims-skolko Biomed News
on Scholarly communication
Issue of 2021-05-09
twenty papers selected by
Thomas Krichel, Open Library Society



  1. Scientometrics. 2021 Apr 26. 1-19
      Methodological mistakes, data errors, and scientific misconduct are considered prevalent problems in science that are often difficult to detect. In this study, we explore the potential of using data from Twitter for discovering problems with publications. In this case study, we analyzed tweet texts of three retracted publications about COVID-19 (Coronavirus disease 2019)/SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2) and their retraction notices. We did not find early warning signs in tweet texts regarding one publication, but we did find tweets that cast doubt on the validity of the other two publications shortly after their publication date. An extension of our current work might lead to an early warning system that makes the scientific community aware of problems with certain publications. Other sources, such as blogs or post-publication peer-review sites, could be included in such an early warning system. The methodology proposed in this case study should be validated using larger publication sets that also include a control group, i.e., publications that were not retracted.
    Keywords:  Altmetrics; COVID-19; Retracted papers; SARS-CoV-2; Scientometrics; Twitter
    DOI:  https://doi.org/10.1007/s11192-021-03962-7
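The early-warning idea in this abstract can be sketched in code. The doubt-signal vocabulary and function names below are illustrative assumptions, not the authors' actual method (they examined tweet texts qualitatively); this is only a toy keyword filter showing the shape of such a system.

```python
# Toy early-warning filter over tweet texts. The vocabulary below is an
# assumption for illustration, not taken from the study.

DOUBT_SIGNALS = {
    "flawed", "doubt", "questionable", "misconduct",
    "retract", "retraction", "not reproducible", "fraud",
}

def flags_doubt(tweet_text: str) -> bool:
    """Return True if the tweet contains any doubt-signal phrase."""
    text = tweet_text.lower()
    return any(signal in text for signal in DOUBT_SIGNALS)

def early_warnings(tweets):
    """Filter (date, text) tweets down to potential early warnings."""
    return [(date, text) for date, text in tweets if flags_doubt(text)]

tweets = [
    ("2020-02-01", "Great new COVID-19 paper out today!"),
    ("2020-02-03", "The stats in this paper look deeply flawed to me."),
    ("2020-02-05", "I doubt these results will replicate."),
]
print(early_warnings(tweets))  # two of the three tweets are flagged
```

A real system would, as the abstract suggests, also ingest blogs and post-publication peer-review sites and would need validation against a control group of non-retracted papers.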
  2. PeerJ Comput Sci. 2021 ;7 e445
     Background: Results of scientific experiments and research, whether conducted by individuals or organizations, are published and shared with the scientific community in different types of scientific publications, such as books, chapters, journals, articles, reference works, and reference-work entries. These documents have two aspects: their content and their metadata. The metadata of scientific documents can be used to increase mutual cooperation, to find people with common interests and related research, and to find scientific documents in matching domains. The major obstacle to obtaining these benefits is that publication metadata is typically available only in unstructured (or semi-structured) formats, so it cannot be used to answer smart queries that support computation and analysis over scientific publications data. Moreover, acquiring and processing publications data is a complicated, time-consuming, and resource-intensive task.
    Methods: To address this problem, we developed a generic framework named the Linked Open Publications Data Framework (LOPDF). The LOPDF framework can be used to crawl, process, extract, and produce machine-understandable data (i.e., linked open data) about scientific publications from publisher-specific sources such as portals, XML exports, and websites. In this paper we present the architecture, process, and algorithm that we developed to process textual publications data and to produce semantically enriched data as RDF datasets (i.e., open data).
    Results: The resulting datasets can be queried via the SPARQL protocol. We also present a quantitative and qualitative analysis of the resulting datasets, which can ultimately be used to characterize the research behavior of organizations in a rapidly growing knowledge society. Finally, we discuss potential uses of producing and processing such open publications data, and how the results of smart queries over the resulting datasets can be used to compute impact and perform different types of analysis on scientific publications data.
    Keywords:  Algorithms analysis; Digital libraries; Ontological reasoning; Open data
    DOI:  https://doi.org/10.7717/peerj-cs.445
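The "smart query" idea can be made concrete with a minimal sketch. LOPDF actually emits RDF queried with SPARQL; here a plain-Python triple store and pattern matcher stand in for that machinery, and all identifiers and data are invented for illustration.

```python
# Tiny in-memory stand-in for SPARQL-style queries over publication
# metadata. Real LOPDF output is RDF; these (subject, predicate, object)
# tuples and the matcher below only illustrate the idea.

TRIPLES = [
    ("paper:1", "dc:title", "Linked Data for Libraries"),
    ("paper:1", "dc:creator", "author:ali"),
    ("paper:1", "dc:type", "JournalArticle"),
    ("paper:2", "dc:title", "Ontologies in Practice"),
    ("paper:2", "dc:creator", "author:ali"),
    ("paper:2", "dc:type", "BookChapter"),
]

def match(pattern, triples=TRIPLES):
    """Match a triple pattern; None behaves like a SPARQL variable."""
    return [t for t in triples
            if all(p is None or p == v for p, v in zip(pattern, t))]

# "Which publications did author:ali write?"  In SPARQL, roughly:
#   SELECT ?p WHERE { ?p dc:creator author:ali }
papers = [s for s, _, _ in match((None, "dc:creator", "author:ali"))]
print(papers)  # → ['paper:1', 'paper:2']
```

Aggregating such query results per organization is what the abstract means by computing "research behavior" from open publications data.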
  3. Brain Neurosci Adv. 2021 Jan-Dec;5: 23982128211006574
      Brain and Neuroscience Advances has grown in tandem with the British Neuroscience Association's campaign to build Credibility in Neuroscience, which encourages actions and initiatives aimed at improving reproducibility, reliability and openness. This commitment to credibility impacts not only what the Journal publishes, but also how it operates. With that in mind, the Editorial Board sought the views of the neuroscience community on the peer review process, and on how they should respond to the Journal Impact Factor that will be assigned to Brain and Neuroscience Advances. In this editorial, we present the results of a survey of neuroscience researchers conducted in the autumn of 2020 and discuss the broader implications of our findings for the Journal and the neuroscience community.
    Keywords:  Journal impact factor; credibility in neuroscience; double-blind review; open review; peer review; responsible research metrics; transparent review
    DOI:  https://doi.org/10.1177/23982128211006574
  4. Int J Surg. 2021 May 03. pii: S1743-9191(21)00098-4. [Epub ahead of print] 105964
      
    Keywords:  Academic social networking; Author-level metrics; Medical advertising; Online research articles; Research metrics
    DOI:  https://doi.org/10.1016/j.ijsu.2021.105964
  5. F1000Res. 2021 ;10 100
      Background: Funded health research is being published in journals that many regard as "predatory", deceptive, and non-credible. We do not currently know whether funders provide guidance on how to select a journal in which to publish funded health research. Methods: We identified the largest 46 philanthropic, public, development assistance, public-private partnership, and multilateral funders of health research by expenditure globally, as well as four public funders from lower-middle-income countries, from the list at https://healthresearchfunders.org. One of us identified guidance on disseminating funded research from each funder's website (August/September 2017), then extracted information about selecting journals, which was verified by another assessor. Discrepancies were resolved by discussion. Results were summarized descriptively. This research used publicly available information; we did not seek verification with funding bodies. Results: The majority (44/50) of sampled funders indicated funding health research. 38 (of 44, 86%) had publicly available information about disseminating funded research, typically called "policies" (29, 76%). Of these 38, 36 (95%) mentioned journal publication for dissemination, of which 13 (36%) offered variable guidance on selecting a journal, all of it relating to the funder's open access mandate. Six funders (17%) outlined publisher requirements or features by which to select a journal. One funder linked to a document providing features of journals to look for (e.g., listed in the Directory of Open Access Journals) and to be wary of (e.g., no journal scope statement, uses direct and unsolicited marketing). Conclusions: Few funders provided guidance on how to select a journal in which to publish funded research. Funders have a duty to ensure that the research they fund is discoverable by others. This research is a benchmark for funder guidance on journal selection prior to the January 2021 implementation of Plan S (a global, funder-led initiative to ensure immediate, open access to funded, published research).
    Keywords:  health research funders; journal selection; journals; publishing
    DOI:  https://doi.org/10.12688/f1000research.27745.1
  6. PLoS One. 2021 ;16(5): e0251176
       INTRODUCTION: In academia, many institutions use journal article publication productivity for making decisions on tenure and promotion, funding grants, and rewarding stellar scholars. Although non-alphabetical ordering of coauthors can signal the extent of each author's contribution to a project, many disciplines in academia follow the norm of ordering coauthors alphabetically by surname in journal publications. By assessing business academic publications, this study investigates the hypothesis that alphabetical author ordering disincentivizes teamwork and reduces the overall quality of scholarship.
    METHODS: To address our objectives, we accessed data from 21,353 articles published over a 20-year period across the four main business subdisciplines. The articles selected are all those published by the four highest-ranked journals (in each year) and four lower-ranked journals (in each year) for accounting, business technology, marketing, and organizational behavior. Poisson regression and binary logistic regression were utilized for hypothesis testing.
    RESULTS: This study finds that, although team size among business scholars is increasing over time, alphabetical ordering as a convention in journal article publishing disincentivizes author teamwork. This disincentive results in fewer authors per publication than for publications using contribution-based ordering of authors. Importantly, article authoring teamwork is related to article quality. Specifically, articles written by a single author are typically of lesser quality than articles published by coauthors, but the number of coauthors exhibits decreasing returns to scale: coauthoring teams of one to three are positively related to high-quality articles, but larger teams are not. Alphabetical ordering itself, however, is positively associated with quality even though it inhibits teamwork, but journal article coauthoring has a greater impact on article quality than does alphabetical ordering.
    CONCLUSIONS: These findings have important implications for academia. Scholars respond to incentives, yet alphabetical ordering of journal article authors conflicts with what is beneficial for the progress of academic disciplines. Based on these findings, we recommend that, to drive the highest-quality research, teamwork should be incentivized: all fields should adopt a contribution-based journal article author-ordering convention and avoid author ordering based upon the spelling of surnames. Although this study was undertaken using articles from business journals, its findings should generalize across all of academia.
    DOI:  https://doi.org/10.1371/journal.pone.0251176
  7. Clin Transl Sci. 2021 May 07.
      Retractions of coronavirus disease 2019 (COVID-19) papers in high-impact journals, such as The Lancet and the New England Journal of Medicine, have been panned as major scientific fraud in public media. The initial reaction to this news was to seek out scapegoats and blame individual authors, peer reviewers, editors, and journals for wrongdoing. This paper suggests that scapegoating a few individuals for faulty science is a myopic approach to the more profound problem with peer review. Peer review in its current, limited form cannot be expected to adequately address the scope and complexity of large interdisciplinary research collaborations, which are central in translational research. In addition, empirical studies on the effectiveness of traditional peer review reveal its very real potential for bias and groupthink; as such, expectations regarding the capacity and effectiveness of the current peer-review process are unrealistic. This paper proposes a new vision of peer review in translational science that would allow early release of a manuscript to ensure expediency while also creating a forum, or a collective of various experts, to actively comment on, scrutinize, and even build on the research under review. The aim would be not only to generate open discussion and oversight regarding the quality and limitations of the research, but also to assess the extent to which, and the means by which, that knowledge can translate into social benefit.
    DOI:  https://doi.org/10.1111/cts.13050
  8. Stud Hist Philos Sci. 2021 May 01. pii: S0039-3681(21)00032-7. [Epub ahead of print]88 1-9
      Both philosophers and scientists have recently promoted transparency as an important element of responsible scientific practice. Philosophers have placed particular emphasis on the ways that transparency can assist with efforts to manage value judgments in science responsibly. This paper examines a potential challenge to this approach, namely, that efforts to promote transparency can themselves be value-laden. This is particularly problematic when transparency incorporates second-order value judgments that are underwritten by the same values at stake in the desire for transparency about the first-order value judgments involved in scientific research. The paper uses a case study involving research on Lyme disease to illustrate this worry, but it responds by elucidating a range of scenarios in which transparency can still play an effective role in managing value judgments responsibly.
    Keywords:  Lyme disease; Open science; Science communication; Transparency; Values in science
    DOI:  https://doi.org/10.1016/j.shpsa.2021.03.008
  9. Ann Med Surg (Lond). 2020 Dec;60 140-145
       Background: Physician scientists who are also Editorial Board members or Associate Editors may prefer publishing in their own journal and therefore create an environment for conflicts of interest to arise.
    Objectives: To assess the relationship between the number of peer-reviewed publications in surgical journals in which authors serve as Editorial Board Members and Associate Editors and their total number of annual publications.
    Materials and methods: A cross-sectional study using PubMed was performed to determine the total annual number of peer-reviewed publications by Editorial Board Members/Associate Editors and the number published in their respective affiliated journals from 2016 to 2019. Significance was defined as p < 0.05.
    Results: 80 Associate Editors and 721 Editorial Board Members (n = 801 total) were analyzed from 10 surgical journals. The mean number of total annual peer-reviewed publications varied from 5.19 to 17.18. The mean number of annual peer-reviewed publications in affiliated journals varied from 0.06 to 2.53. Multiple significant associations were discovered between the total number of annual peer-reviewed publications and number of peer-reviewed publications in affiliated journals for all authors/surgical journals evaluated, except for the International Journal of Surgery (p > 0.05).
    Conclusions: We found significant associations between the total number of annual peer-reviewed publications by Editorial Board Members/Associate Editors and the number of their annual peer-reviewed publications in their affiliated surgical journals. The implementation and enforcement of a standardized double-blind review process and mandatory reporting of any potential conflicts of interest can reduce possible bias and promote a fair and high-quality peer-review process.
    Keywords:  Conflicts of interest; Double-blind peer review process; Editorial board membership; Research productivity; Surgical journals
    DOI:  https://doi.org/10.1016/j.amsu.2020.10.042
  10. PLoS One. 2021 ;16(5): e0250362
       OBJECTIVES: Publication bias, non-publication, and selective reporting of animal studies limit progress toward the 3Rs (Replacement, Reduction, and Refinement) that guide ethical animal testing, waste public resources, and result in redundant research, which collectively undermine the public's trust in scientific reliability. In this study, we aimed to 1) validate findings from a previous follow-up study by our team that examined the publication rates of animal studies from protocol to publication and 2) identify incentives for improving publication rates in animal research.
    METHODS: The researchers responsible for the animal proposals (n = 210) from our previous study were contacted as participants for a Web-based survey between October 2019 and April 2020. Question types varied between free text questions, answer options based on a 5-point Likert scale and closed yes/no questions.
    RESULTS: In total, 78 researchers responsible for 101 of 210 animal study proposals participated, yielding a response rate of 48.1%. Results showed that the publication rate increased from 67% in our follow-up study to 70%. According to a 5-point Likert scale (from 1 = "not relevant" to 5 = "extremely relevant"), the most widely accepted suggestions for increasing publication rates were "Publication costs for open access journals are fully covered by funders or universities" (mean 4.02, SD 1.01), "Performance-based allocation of intramural funds for results reporting of animal research not supporting the initial hypothesis (including preprints and repositories)" (mean 3.37, SD 1.05), and "Researchers receive more information from scientific journals that also publish non-significant results" (mean 3.30, SD 1.02).
    CONCLUSION: While the extent of publication and publication practices have been thoroughly investigated for clinical trials, fewer data are available for animal research to date. This study therefore helps complete the picture of publication practices in animal research. Suggestions from our survey may help improve the publication rates of animal studies.
    DOI:  https://doi.org/10.1371/journal.pone.0250362
  11. Scientometrics. 2021 Apr 26. 1-16
      During the previous Ebola and Zika outbreaks, researchers shared their data, allowing many published epidemiological studies to be produced solely from open research data, to speed up investigation and control of these infections. This study aims to evaluate the dissemination of the COVID-19 research data underlying scientific publications. Analysis of COVID-19 publications from December 1, 2019, to April 30, 2020, was conducted through the PubMed Central repository to evaluate the research data made available with each publication, either as supplementary material or deposited in repositories. The PubMed Central search generated 5,905 records, of which 804 papers included complementary research data, especially as supplementary material (77.4%). The most productive journals were The New England Journal of Medicine, The Lancet and The Lancet Infectious Diseases, the most frequent keyword was pneumonia, and the most used repositories were GitHub and GenBank. An expected growth in the number of published articles following the course of the pandemic is confirmed in this work, while underlying research data are available for only 13.6% of papers. It can be deduced that data sharing is not a common practice, even in health emergencies such as the present one. High-impact generalist journals have accounted for a large share of global publishing. The topics most often covered relate to epidemiological and public health concepts, genetics, virology and respiratory diseases, such as pneumonia. However, it is essential to interpret these data with caution, following the evolution of publications and their funding in the coming months.
    Keywords:  COVID-19; Data sharing; PubMed central; Repository; Supplementary material
    DOI:  https://doi.org/10.1007/s11192-021-03971-6
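A PubMed Central search over a fixed date window like the one in this abstract can be issued through the NCBI Entrez E-utilities ESearch endpoint. The search term below is an illustrative assumption; the abstract does not give the authors' exact query string.

```python
# Sketch of building an NCBI E-utilities ESearch request against PMC
# for the study's date window. The term "COVID-19" is an assumption.
from urllib.parse import urlencode

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pmc_search_url(term, mindate, maxdate, retmax=100):
    """Build an ESearch URL over PMC, restricted by publication date."""
    params = {
        "db": "pmc",
        "term": term,
        "mindate": mindate,   # YYYY/MM/DD
        "maxdate": maxdate,
        "datetype": "pdat",   # filter on publication date
        "retmax": retmax,
        "retmode": "json",
    }
    return f"{BASE}?{urlencode(params)}"

url = pmc_search_url("COVID-19", "2019/12/01", "2020/04/30")
print(url)
```

Fetching this URL returns a JSON list of PMC record IDs, which could then be screened, as in the study, for supplementary material and repository links.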
  12. Syst Rev. 2021 May 03. 10(1): 131
      Published protocols have the potential to reduce bias in the conduct and reporting of systematic reviews (SRs). When reporting the results of a completed SR, the question might arise whether text used in the protocol can also be used in the completed SR. Does this constitute text recycling, plagiarism, or even copyright infringement? In theory, no major changes to the protocol text will be expected for the introduction and methods sections if the SR is completed in time. The benefits of maintaining the introduction and methods sections of a protocol in the published SR are straightforward. Authors will require less time for writing up the completed SR. Potential benefits can also be expected for peer reviewers and editors. However, reusing text can be described as self-plagiarism. The question to be answered is whether this type of self-plagiarism is acceptable when reusing previously published text (as when copying text from the protocol and pasting it into the subsequent completed SR). The "traditional answer" to this question is "no", because authors should not get credit for one piece of work more than once unless the work is cited appropriately. In contrast, we propose that in this context reuse seems to be fully acceptable from a scientific and ethical perspective. As such, authors should not be accused of plagiarism in this case, but rather be encouraged to be efficient. However, legal issues need to be taken into consideration (e.g., copyright). We hope to stimulate a discussion on this topic among authors, readers, editors, and publishers.
    Keywords:  Meta-analysis; Plagiarism; Protocol; Publishing; Registration; Systematic review
    DOI:  https://doi.org/10.1186/s13643-021-01675-9
  13. Nutr Clin Pract. 2021 May 06.
      
    Keywords:  author guidelines; peer review; publishing; research report
    DOI:  https://doi.org/10.1002/ncp.10676
  14. Front Psychol. 2021 ;12 601849
      The American Educational Research Association and the American Psychological Association have published standards for reporting on research. Transparency in reporting measures and data collection is paramount for the interpretability and replicability of research. We analyzed 57 articles that assessed alphabet knowledge (AK) using researcher-developed measures. The quality of reporting on different elements of AK measures and data collection was related neither to the journal type nor to the journal's impact factor or rank, but rather seemed to depend on the individual authors, reviewers, and journal editor. We propose various topics related to effective reporting of measures and data collection methods that we encourage the early childhood and literacy communities to discuss.
    Keywords:  alphabet knowledge; emergent literacy; evaluation; letter knowledge; research methodology
    DOI:  https://doi.org/10.3389/fpsyg.2021.601849
  15. IUCrJ. 2021 May 01. 8(Pt 3): 331-332
      Biomedical challenges such as the present COVID-19 pandemic require both good science and excellent communication between scientists and the general public. This underscores the importance of presenting our science in innovative ways that make it accessible to all.
    Keywords:  COVID-19; SARS-CoV-2; coronaviruses; editorial; scientific communication
    DOI:  https://doi.org/10.1107/S2052252521003894
  16. Can J Anaesth. 2021 May 07.
      Human beings are predisposed to identifying false patterns in statistical noise, a likely survival advantage during our evolutionary development. Moreover, humans seem to prefer "positive" results over "negative" ones. These two cognitive features lay the groundwork for the premature adoption of false-positive studies. Added to this predisposition are the tendency of journals to "overbid" for exciting or newsworthy manuscripts, incentives in both the academic and publishing industries that value novelty over truth and scientific rigour, and a growing dependence on complex statistical techniques that some reviewers do not understand. The purpose of this article is to describe the underlying causes of premature adoption and provide recommendations that may improve the quality of published science.
    Keywords:  anesthesia; apophenia; bias; incentives; premature adoption
    DOI:  https://doi.org/10.1007/s12630-021-02005-2
  17. Biosci Rep. 2021 May 04. pii: BSR20211016. [Epub ahead of print]
      As Bioscience Reports enters its fifth decade of continuous multi-disciplinary life science publishing, here we present a timely overview of the journal. In addition to introducing ourselves and new Associate Editors for 2021, we reflect on the challenges the new Editorial Board has faced and overcome since we took over the editorial leadership in June of 2020, and detail some key strategies on how we plan to encourage more submissions and broader readership for a better and stronger journal in the coming years.
    Keywords:  biochemical techniques and resources; biomarkers; cell cycle; genomics; molecular basis of health and disease
    DOI:  https://doi.org/10.1042/BSR20211016