bims-skolko Biomed News
on Scholarly communication
Issue of 2022‒02‒20
twenty-two papers selected by
Thomas Krichel
Open Library Society


  1. Nature. 2022 Feb 16.
      
    Keywords:  Developing world; Publishing
    DOI:  https://doi.org/10.1038/d41586-022-00342-w
  2. BMJ Glob Health. 2022 Feb;7(2): e008059
      INTRODUCTION: Health researchers from low-income and middle-income countries (LMICs) are under-represented in the academic literature. Scientific writing and publishing interventions may help researchers publish their findings; however, we lack evidence about the prevalence and effectiveness of such interventions. This review describes interventions for researchers in LMICs aimed at strengthening capacity for writing and publishing academic journal articles.
    METHODS: We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines to report literature searches in PubMed, Embase, Global Health, Scopus and ERIC. Our keywords included LMICs, low-income and middle-income countries, health research and writing/publication support interventions, with no restrictions on publication date. Our screening process consisted of title screening, abstract review and full-text review. We collected information about the content, implementation and evaluation of each included intervention.
    RESULTS: We identified 20 interventions designed to strengthen capacity for scientific writing and publishing. We separately summarised information from the 14 interventions that reported submitted or published papers as outcomes, reasoning that these quantifiable metrics of success may offer particular insight into the intervention components that lead to publication. The writing and publishing components in this 'Publications Reported' group averaged 5.4 days in length, compared with 2.5 days in the other group, which we refer to as 'Other Interventions'. Whereas all 14 Publications Reported interventions incorporated mentors, only two of five in the Other Interventions group did so. Across interventions, leaders stressed the importance of a high ratio of mentors to participants, the need to accommodate the time demands of busy researchers, and the necessity of a budget for open access fees and high-quality internet connectivity.
    CONCLUSION: Writing and publishing interventions in LMICs are an underutilised opportunity for capacity strengthening. To facilitate the implementation of high-quality interventions, those running future writing and publishing interventions should share their experiences by publishing detailed information about their approach and effectiveness.
    Keywords:  systematic review
    DOI:  https://doi.org/10.1136/bmjgh-2021-008059
  3. PLoS One. 2022;17(2): e0263023
      The prevalence of research misconduct and questionable research practices (QRPs), and their associations with a range of explanatory factors, have not been studied sufficiently among academic researchers. The National Survey on Research Integrity targeted all disciplinary fields and academic ranks in the Netherlands. It included questions about engagement in fabrication, falsification and 11 QRPs over the previous three years, and 12 explanatory factor scales. We ensured strict identity protection and used the randomized response method for questions on research misconduct (a brief illustration of this technique follows this entry). 6,813 respondents completed the survey. Prevalence of fabrication was 4.3% (95% CI: 2.9, 5.7) and of falsification 4.2% (95% CI: 2.8, 5.6). Prevalence of QRPs ranged from 0.6% (95% CI: 0.5, 0.9) to 17.5% (95% CI: 16.4, 18.7), with 51.3% (95% CI: 50.1, 52.5) of respondents engaging frequently in at least one QRP. Being a PhD candidate or junior researcher increased the odds of frequently engaging in at least one QRP, as did being male. Scientific norm subscription (odds ratio (OR) 0.79; 95% CI: 0.63, 1.00) and perceived likelihood of detection by reviewers (OR 0.62, 95% CI: 0.44, 0.88) were associated with less research misconduct. Publication pressure was associated with more frequently engaging in one or more QRPs (OR 1.22, 95% CI: 1.14, 1.30). We found a higher prevalence of misconduct than earlier surveys. Our results suggest that greater emphasis on scientific norm subscription, strengthening reviewers in their role as gatekeepers of research quality and curbing the "publish or perish" incentive system promotes research integrity.
    DOI:  https://doi.org/10.1371/journal.pone.0263023
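    A brief illustration of the randomized response technique referenced above: in a common forced-response variant, a private randomizing device makes some respondents answer "yes" regardless of the truth, and the true prevalence is recovered from the observed proportion of "yes" answers. The sketch below (Python) shows that estimator under illustrative design probabilities; the actual design and probabilities used in the National Survey on Research Integrity are not stated in this abstract.
      # Minimal sketch of prevalence estimation under a forced-response
      # randomized response design. The design probabilities below are
      # illustrative assumptions, not those of the survey described above.
      import math

      def rr_prevalence(yes_fraction, n, p_truth=4/6, p_forced_yes=1/6):
          """Estimate true prevalence and a normal-approximation 95% CI.

          P(observed "yes") = p_truth * prevalence + p_forced_yes,
          so prevalence = (P(yes) - p_forced_yes) / p_truth.
          """
          pi_hat = (yes_fraction - p_forced_yes) / p_truth
          half_width = 1.96 * math.sqrt(yes_fraction * (1 - yes_fraction) / n) / p_truth
          return pi_hat, (pi_hat - half_width, pi_hat + half_width)

      # Hypothetical example: 20% "yes" answers among 6,813 respondents
      # implies an estimated true prevalence of about 5%.
      print(rr_prevalence(0.20, 6813))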
  4. Hellenic J Cardiol. 2022 Feb 15. pii: S1109-9666(22)00021-5. [Epub ahead of print]
      OBJECTIVE: Journal abstracts are crucial for the identification and initial assessment of the content of studies. We evaluated whether authors in the field of cardiovascular diseases (CVDs) reported the abstracts of diagnostic test accuracy systematic reviews (DTA SRs) adequately, as defined by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)-DTA guidelines.
    METHODS: SRs of DTA studies in CVDs published in general and specialized medical journals were identified in a MEDLINE search covering 2010-2020. Adherence to the 12 PRISMA-DTA for Abstracts items was assessed independently by two reviewers and compared by journal type. Moreover, the association of reporting completeness with different characteristics was investigated.
    RESULTS: We included 72 abstracts. Studies published in general medical journals had a higher mean reporting score than those in specialized journals (6.2 vs 5.3 out of 12 items; mean difference: 0.88; 95% confidence interval: 0.21, 1.55). PRISMA-DTA adherence was higher in journals that had adopted this guideline and in articles with structured abstracts. However, the number of participants analysed, funding and registration were the least-reported items in the identified abstracts.
    CONCLUSION: The reporting of abstracts of DTA reports in CVDs is suboptimal according to the PRISMA-DTA guidelines. Abstract reporting could be improved with higher word count limits and the adoption of the PRISMA-DTA guidelines, especially in specialized journals.
    Keywords:  Abstract; Cardiovascular diseases; Diagnostic accuracy studies; PRISMA-DTA statement; Reporting completeness; Systematic review
    DOI:  https://doi.org/10.1016/j.hjc.2022.02.001
  5. BMC Vet Res. 2022 Feb 18. 18(1): 73
      BACKGROUND: Retractions are a key proxy for recognizing errors in research and publication and for reconciling misconduct in the scientific literature. The underlying factors associated with retractions can provide insight and guide policy for journal editors and authors within a discipline. The goal of this study was to systematically review and analyze retracted articles in veterinary medicine and animal health. A database search for retractions of articles with a veterinary/animal health topic, in a veterinary journal, or by veterinary institution-affiliated authors was conducted from the first available records through February 2019 in MEDLINE/PubMed, Web of Science, Scopus, Retraction Watch, and Google Scholar. Annual frequency of retractions, journal and article characteristics, author affiliation and country, reasons for retraction, and retraction outcomes were recorded.
    RESULTS: Two hundred forty-two articles retracted between 1993 and 2019 were included in the study. Over this period, the estimated rate of retraction increased from 0.03/1000 to 1.07/1000 veterinary articles. Median time from publication to retraction was 478 days (range 0-3653 days). Retracted articles were published in 30 (12.3%) veterinary journals and 132 (81.5%) nonveterinary journals. Veterinary journals had disproportionately more retractions than nonveterinary journals (P = .0155). Authors/groups with ≥2 retractions accounted for 37.2% of retractions. Authors from Iran and China published 19.4% and 18.2% of retracted articles, respectively. Authors were affiliated with a faculty of veterinary medicine in 59.1% of retracted articles. Of 242 retractions, 204 (84.3%) were research articles, of which 6.4% were veterinary clinical research. Publication misconduct (plagiarism, duplicate publication, compromised peer review) accounted for 75.6% of retractions, compared with errors (20.6%) and research misconduct (18.2%). Journals published by societies/institutions were less likely than those from commercial publishers to indicate a reason for retraction. Thirty-one percent of HTML articles and 14% of PDFs were available online but not marked as retracted.
    CONCLUSIONS: The rate of retraction in the field of veterinary and animal health has increased roughly 10-fold per 1000 articles since 1993, driven primarily by publication misconduct, often by repeat offenders. Veterinary journals and society/institutional journals could benefit from improved quality of retraction notices.
    Keywords:  Editorial policies; Publication ethics; Publication misconduct; Research misconduct; Veterinary journals
    DOI:  https://doi.org/10.1186/s12917-022-03167-x
  6. J Dent. 2022 Feb 12. pii: S0300-5712(22)00124-5. [Epub ahead of print] 104067
      OBJECTIVES: To investigate whether dental journal articles that are open access (OA) receive greater citation counts and higher Altmetric Attention Scores (AAS) than non-OA articles in the long term.
    METHODS: Eligible dental journal articles published in 2013 were identified via PubMed and Web of Science. Unpaywall and corresponding URLs were manually checked to determine the OA status of each included article 7 years after publication. Citation counts were extracted from Web of Science and Scopus, and AAS was harvested from the Altmetric Explorer. Multivariable general linear regression analyses were performed to investigate the association between OA and citation count, as well as between OA and AAS.
    RESULTS: Among the 755 included articles, 309 (40.9%) were freely available online. Articles available from publishers accounted for 64.4% (199/309) of the OA articles, and those available through self-archiving accounted for 56.0% (173/309). According to the regression analyses, OA articles had significantly greater citation counts (P = 0.001) and AAS (P < 0.001) than non-OA articles.
    CONCLUSIONS/CLINICAL SIGNIFICANCE: In the field of dentistry, about 41% of journal articles are OA 7 years after publication, and OA articles available from publishers are more common than those made available by authors through self-archiving. OA articles tend to have greater scientific and social impact than non-OA articles in the long term.
    Keywords:  altmetrics; bibliometrics; dentistry; evidence-based dentistry; open access; research waste
    DOI:  https://doi.org/10.1016/j.jdent.2022.104067
  7. Med Sci (Paris). 2022 Feb;38(2): 215-217
      In order to promote biomedical research, the French Ministry of Solidarity and Health has developed software used to rate medical doctors according to their scientific production: SIGAPS points (System of Interrogation, Management and Analysis of Scientific Publications). These points (1-32 points) are awarded after the publication of a scientific paper according to the quality of the journal (based on its impact factor) and the doctor's rank among the authors. The points are then converted into a sum of money received by the health facility (1 point = approximately 648 euros), paid 4 years in a row (an illustrative calculation follows this entry). Does this "fee-for-service" encourage doctors to publish quickly and at any price, regardless of the quality of their research?
    DOI:  https://doi.org/10.1051/medsci/2022006
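    A back-of-the-envelope illustration of the financial incentive described above, reading the abstract as roughly 648 euros per SIGAPS point per year, paid for 4 consecutive years; the real allocation formula is more involved than this sketch.
      # Illustrative calculation of the funding attached to one publication,
      # using the approximate figures quoted in the abstract above.
      EUROS_PER_POINT = 648   # approximate value per point cited in the abstract
      YEARS_PAID = 4          # points generate funding 4 years in a row

      def sigaps_funding(points):
          """Total funding (euros) the health facility receives for one paper."""
          return points * EUROS_PER_POINT * YEARS_PAID

      print(sigaps_funding(32))  # 82944 euros over 4 years (top-weighted paper)
      print(sigaps_funding(1))   # 2592 euros over 4 years (lowest-weighted paper)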
  8. Eur Urol. 2022 Feb 09. pii: S0302-2838(22)00084-7. [Epub ahead of print]
      Visual abstracts (VAs) are graphical representations of the key findings in manuscripts and have been adopted by many journals to improve content dissemination via social media. We sought to assess whether VAs, compared to key figures (KFs), increased reader engagement via social media using articles published in European Urology. We prospectively randomized 200 consecutive new publications to representation on Twitter and Instagram using either a VA (n = 99) or a KF (n = 101). Randomization was stratified by prostate cancer content. The primary outcome was Twitter impressions. Secondary outcomes included Twitter total engagements, link clicks, likes, and retweets, as well as Instagram likes. Analysis of covariance was conducted using the stratification variable as a covariate (an illustrative sketch of this analysis follows this entry). We found that Twitter impressions were greater for tweets containing VAs than for those containing KFs (8385 vs 6882; adjusted difference 1480, 95% confidence interval [CI] 434-2526; p = 0.006). VA use was also associated with more retweets and likes (p < 0.002), but fewer full-article link clicks than KFs (60 vs 105; adjusted difference 45, 95% CI 21-70; p = 0.0004). The choice between VA and KF should depend on the relative value given to impressions versus full-article link clicks. PATIENT SUMMARY: We found that use of a visual abstract increases the social media reach of new urology articles compared with key figures from the manuscript, but was associated with a significantly lower click-through rate. In the increasingly virtual world of academic medicine, these findings may assist authors, editors, and publishers with dissemination of new research.
    Keywords:  Instagram; Social media; Twitter; Urology; Visual abstract
    DOI:  https://doi.org/10.1016/j.eururo.2022.01.041
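    The comparison described above (analysis of covariance on the primary outcome, adjusting for the prostate cancer stratification variable) can be sketched roughly as follows; this is an illustrative reconstruction on simulated placeholder data, not the authors' code.
      # Rough sketch of an ANCOVA comparing Twitter impressions between visual
      # abstracts (VA) and key figures (KF), adjusting for the stratification
      # covariate. All data below are simulated placeholders.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "arm": rng.permutation(["VA"] * 99 + ["KF"] * 101),
          "prostate": rng.integers(0, 2, 200),  # prostate cancer content (0/1)
      })
      df["impressions"] = (6800 + 1500 * (df["arm"] == "VA")
                           + 500 * df["prostate"] + rng.normal(0, 2000, 200))

      # ANCOVA: outcome ~ treatment arm + stratification covariate
      model = smf.ols("impressions ~ C(arm, Treatment(reference='KF')) + prostate",
                      data=df).fit()
      print(model.summary().tables[1])  # adjusted VA-vs-KF difference with 95% CI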
  9. J Magn Reson Imaging. 2022 Feb 15.
      BACKGROUND: Despite the nearly ubiquitous reported use of peer review among reputable medical journals, there is limited evidence that peer review improves the quality of biomedical research and, in particular, of imaging diagnostic test accuracy (DTA) research.
    PURPOSE: To evaluate whether peer review of DTA studies published by imaging journals is associated with changes in completeness of reporting, transparency for risk of bias assessment, and spin.
    STUDY TYPE: Retrospective cross-sectional study.
    STUDY SAMPLE: Cross-sectional study of articles published in Journal of Magnetic Resonance Imaging (JMRI), Canadian Association of Radiologists Journal (CARJ), and European Radiology (EuRad) before March 31, 2020.
    ASSESSMENT: Initial submitted and final versions of manuscripts were evaluated for completeness of reporting using the Standards for Reporting Diagnostic Accuracy Studies (STARD) 2015 and STARD for Abstracts guidelines, transparency of reporting for risk of bias assessment based on Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2), and actual and potential spin using modified published criteria.
    STATISTICAL TESTS: Two-tailed paired t-tests and paired Wilcoxon signed-rank tests were used for comparisons. A P value <0.05 was considered to be statistically significant.
    RESULTS: We included 84 diagnostic accuracy studies accepted by three journals between 2014 and 2020 (JMRI = 30, CARJ = 23, EuRad = 31) of the 692 screened. Completeness of reporting according to STARD 2015 increased significantly between initial submissions and final accepted versions (average reported items: 16.67 vs. 17.47, change of 0.80 [95% confidence interval 0.25-1.17]). No significant difference was found for the reporting of STARD for Abstracts (5.28 vs. 5.25, change of -0.03 [-0.15 to 0.11], P = 0.74), QUADAS-2 (6.08 vs. 6.11, change of 0.03 [-1.00 to 0.50], P = 0.92), actual "spin" (2.36 vs. 2.40, change of 0.04 [0.00 to 1.00], P = 0.39) or potential "spin" (2.93 vs. 2.81, change of -0.12 [-1.00 to 0.00], P = 0.23) practices.
    CONCLUSION: Peer review is associated with a marginal improvement in completeness of reporting in published imaging DTA studies, but not with improvement in transparency for risk of bias assessment or reduction in spin.
    LEVEL OF EVIDENCE: 3 TECHNICAL EFFICACY STAGE: 1.
    Keywords:  peer review; reporting guidelines; research methods
    DOI:  https://doi.org/10.1002/jmri.28116
  10. Addiction. 2022 Feb 13.
      
    Keywords:  Bias; open science; pre-registration; protocols; reproducibility; statistical analysis plans
    DOI:  https://doi.org/10.1111/add.15819
  11. JAMIA Open. 2022 Apr;5(1): ooac001
      Reproducibility in medical research has been a long-standing issue. More recently, the COVID-19 pandemic has publicly underlined this fact, as the retraction of several studies reached general media audiences. A significant number of these retractions occurred after in-depth scrutiny of the methodology and results by the scientific community. Consequently, these retractions have undermined confidence in the peer-review process, which is not considered sufficiently reliable to generate trust in published results. This partly stems from the opacity of published results: the practical implementation of the statistical analysis often remains undisclosed. We present a workflow that uses a combination of informatics tools to foster statistical reproducibility: an open-source programming language, Jupyter Notebook, a cloud-based data repository, and an application programming interface can together streamline an analysis and help kick-start new analyses (a generic sketch of this pattern follows this entry). We illustrate this principle by (1) reproducing the results of the ORCHID clinical trial, which evaluated the efficacy of hydroxychloroquine in COVID-19 patients, and (2) expanding on the analyses conducted in the original trial by investigating the association of premedication with biological laboratory results. Such workflows will be encouraged for future publications from National Heart, Lung, and Blood Institute-funded studies.
    Keywords:  FAIR principles; clinical trial; statistical reproducibility
    DOI:  https://doi.org/10.1093/jamiaopen/ooac001
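    The abstract above describes the general shape of such a workflow (an open-source language, a notebook, a cloud data repository, and a programming interface) without naming the specific services; the sketch below is a generic, hypothetical illustration of that pattern in Python, fetching a de-identified trial dataset from a placeholder API and recomputing a simple endpoint. It is not the authors' actual pipeline, and the URL and column names are invented for illustration.
      # Generic, notebook-style reproducibility sketch: download the analysis
      # dataset from a (hypothetical) data-repository API, then recompute a
      # simple endpoint so the whole analysis can be re-run end to end.
      import io

      import pandas as pd
      import requests

      DATA_URL = "https://example.org/api/v1/datasets/trial/participants.csv"  # placeholder

      def load_trial_data(url: str) -> pd.DataFrame:
          """Fetch the dataset at run time rather than relying on a local copy."""
          response = requests.get(url, timeout=30)
          response.raise_for_status()
          return pd.read_csv(io.StringIO(response.text))

      def endpoint_by_arm(df: pd.DataFrame) -> pd.Series:
          """Recompute an illustrative endpoint: mean outcome score by treatment arm."""
          return df.groupby("arm")["outcome_score_day14"].mean()

      if __name__ == "__main__":
          trial = load_trial_data(DATA_URL)
          print(endpoint_by_arm(trial))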
  12. Wellcome Open Res. 2021;6: 355
      Background: Numerous mechanisms exist to incentivise researchers to share their data. This scoping review aims to identify and summarise evidence on the efficacy of different interventions to promote open data practices and to provide an overview of current research.
    Methods: This scoping review is based on data identified from Web of Science and LISTA, limited to 2016-2021. A total of 1128 papers were screened, with 38 items included. Items were selected if they focused on designing or evaluating an intervention or presented an initiative to incentivise sharing. Items comprised a mixture of research papers, opinion pieces and descriptive articles.
    Results: Seven major themes were identified in the literature: publisher/journal data sharing policies, metrics, software solutions, research data sharing agreements in general, open science 'badges', funder mandates, and initiatives.
    Conclusions: Key messages for data sharing include: the need to build on existing cultures and practices, meeting people where they are and tailoring interventions to support them; the importance of publicising and explaining the policy/service widely; the need for disciplinary data champions to model good practice and drive cultural change; the requirement to resource interventions properly; and the imperative to provide robust technical infrastructure and protocols, such as labelling of data sets, use of DOIs, data standards and use of data repositories.
    Keywords:  data sharing; open data; open science; research data; scoping review
    DOI:  https://doi.org/10.12688/wellcomeopenres.17286.1
  13. J Obstet Gynecol Neonatal Nurs. 2022 Feb 09. pii: S0884-2175(22)00003-X. [Epub ahead of print]
      XXXX.
    DOI:  https://doi.org/10.1016/j.jogn.2022.01.003
  14. Med Arch. 2021 Dec;75(6): 408-412
      The 12th Days of the Academy of Medical Sciences of Bosnia and Herzegovina (AMNuBiH) were organized this year together with the International Academy of Sciences and Arts in Bosnia and Herzegovina, in Sarajevo on December 4, 2021. The title of the symposium was "Scientometry, Citation, Plagiarism and Predatory in Scientific Publishing". Experiences in the areas covered by the conference title were presented by some of the most influential scientists from Bosnia and Herzegovina, who are included among the top 2% of authors in the Stanford scientometric list published in October 2021 in the journal PLoS Biology. Some of the authors are former or current Editors-in-Chief of indexed biomedical journals in Bosnia and Herzegovina, Croatia and North Macedonia (Izet Masic, Asim Kurjak, Doncho Donev, Osman Sinanovic). Also, Sylwia Ufnalska and Izet Masic are or were members of the European Association of Science Editors (EASE) and have extensive experience with the topic of this conference. Scientometrics, the science that analyzes scientific papers and their citation in scientific journals, has become increasingly important for measuring the validity and quality of all kinds of publications deposited in the major online scientific databases, such as WoS, Scopus, Medline, PubMed Central, Embase and Hinari, as well as in academic platforms such as ResearchGate and Academia.edu. Scientometrics uses the Impact and Echo factor to measure the quality of publications in WoS journals, Scopus uses the h-index, and the metric most commonly used in the last 10 years is the Google Scholar index. All of them have advantages and disadvantages, as well as positive and negative influences in academic praxis. Among the greatest, and sadly all too common, problems that participants in the academic process encounter are plagiarism and predatory publishing. To prevent this severest form of academic fraud, authors must give credit to those whose work has helped them by citing references correctly. The presentations at the symposium ("SWEP 2021") analyzed the major components of scientometrics, the basic mechanisms of citation in medical publications, and plagiarism, which stands in opposition to the primary goal of the scientific enterprise: the search for truth.
    Keywords:  Citation; Plagiarism; Predatory; Scientometrics
    DOI:  https://doi.org/10.5455/medarh.2021.75.408-412
  15. Zhonghua Gan Zang Bing Za Zhi. 2022 Jan 20. 30(1): 1-3
      The Chinese Journal of Hepatology has a 2020 core impact factor of 1.807, which positions it first among gastroenterology periodicals. The China Association for Science and Technology has classified it as T1 grade and included it in the catalogue of high-level scientific and technological periodicals. Since 2021, it has received the special publishing fund of the Chongqing Municipal Bureau of Press and Publications, the High-quality Scientific and Technological Periodicals Funding Project of the Chongqing Association for Science and Technology, and the Industry-university-research Cooperation and Collaborative Education Project of the Ministry of Education of the People's Republic of China, and it has won awards such as "Sichuan-Chongqing First-class Scientific and Technological Periodical" and "Chongqing High-quality Scientific and Technological Periodical", supporting its development in both quality and scale. In 2022, we will therefore work to attract high-impact research reports, disseminate academic results in a timely, efficient and accurate manner, highlight the role of digital communication, and pave the way for establishing the journal as a first-class academic journal.
    Keywords:  Message; Plan; Review
    DOI:  https://doi.org/10.3760/cma.j.cn501113-20220112-00017
  16. Front Res Metr Anal. 2021;6: 751553
      The scholarly knowledge ecosystem presents an outstanding exemplar of the challenges of understanding, improving, and governing information ecosystems at scale. This article draws upon significant reports on aspects of the ecosystem to characterize the most important research challenges and promising potential approaches. The focus of this review article is the fundamental scientific research challenges related to developing a better understanding of the scholarly knowledge ecosystem. Across a range of disciplines, we identify reports that are conceived broadly, published recently, and written collectively. We extract the critical research questions, summarize these using quantitative text analysis, and use this quantitative analysis to inform a qualitative synthesis. Three broad themes emerge from this analysis: the need for multi-sectoral cooperation and coordination, for mixed-methods analysis at multiple levels, and for interdisciplinary collaboration. Further, we draw attention to an emerging consensus that scientific research in this area should be guided by a set of core human values.
    Keywords:  open access; open science; research ethics; scholarly communications; scientometrics
    DOI:  https://doi.org/10.3389/frma.2021.751553