bims-skolko Biomed News
on Scholarly communication
Issue of 2019-11-10
eighteen papers selected by
Thomas Krichel, Open Library Society



  1. PLoS One. 2019;14(11): e0224541
      In this article we discuss the five yearly screenings for publications in questionable journals which have been carried out in the context of the performance-based research funding model in Flanders, Belgium. The Flemish funding model expanded from 2010 onwards, with a comprehensive bibliographic database for research output in the social sciences and humanities. Along with an overview of the procedures followed during the screenings for articles in questionable journals submitted for inclusion in this database, we present a bibliographic analysis of the publications identified. First, we show how the yearly number of publications in questionable journals has evolved over the period 2003-2016. Second, we present a disciplinary classification of the identified journals. In the third part of the results section, three authorship characteristics are discussed: multi-authorship, the seniority (or experience level) of authors in general and of the first author in particular, and the relation of the disciplinary scope of the journal (cognitive classification) with the departmental affiliation of the authors (organizational classification). Our results regarding yearly rates of publications in questionable journals indicate that awareness of the risks of questionable journals does not lead to a turn away from open access in general. The number of publications in open access journals rises every year, while the number of publications in questionable journals decreases from 2012 onwards. We find further that both early career and more senior researchers publish in questionable journals. We show that the average proportion of senior authors contributing to publications in questionable journals is somewhat higher than that for publications in open access journals. In addition, this paper yields insight into the extent to which publications in questionable journals pose a threat to the public and political legitimacy of a performance-based research funding system of a western European region. We include concrete suggestions for those tasked with maintaining bibliographic databases and screening for publications in questionable journals.
    DOI:  https://doi.org/10.1371/journal.pone.0224541
  2. EMBO Rep. 2019 Nov 03. e49472
      Peer review to allocate funding for researchers and projects has faced difficulties lately and come under criticism. Various alternatives and improvements are being tested to address these problems.
    DOI:  https://doi.org/10.15252/embr.201949472
  3. Nature. 2019 Nov;575(7781): 51
      
    Keywords:  Funding; Publishing
    DOI:  https://doi.org/10.1038/d41586-019-03389-4
  4. BMJ. 2019 Nov 06;367: l5896
       OBJECTIVE: To assess the effect of disclosing authors' conflict of interest declarations to peer reviewers at a medical journal.
    DESIGN: Randomized controlled trial.
    SETTING: Manuscript review process at the Annals of Emergency Medicine.
    PARTICIPANTS: Reviewers (n=838) who reviewed manuscripts submitted between 2 June 2014 and 23 January 2018 inclusive (n=1480 manuscripts).
    INTERVENTION: Reviewers were randomized to either receive (treatment) or not receive (control) authors' full International Committee of Medical Journal Editors format conflict of interest disclosures before reviewing manuscripts. Reviewers rated the manuscripts as usual on eight quality ratings and were then surveyed to obtain "counterfactual scores"-that is, the scores they believed they would have given had they been assigned to the opposite arm-as well as attitudes toward conflicts of interest.
    MAIN OUTCOME MEASURE: Overall quality score that reviewers assigned to the manuscript on submitting their review (1 to 5 scale). Secondary outcomes were scores the reviewers submitted for the seven more specific quality ratings and counterfactual scores elicited in the follow-up survey.
    RESULTS: Providing authors' conflict of interest disclosures did not affect reviewers' mean ratings of manuscript quality (M_control=2.70 (SD 1.11) out of 5; M_treatment=2.74 (1.13) out of 5; mean difference 0.04, 95% confidence interval -0.05 to 0.14), even for manuscripts with disclosed conflicts (M_control=2.85 (1.12) out of 5; M_treatment=2.96 (1.16) out of 5; mean difference 0.11, -0.05 to 0.26). Similarly, no effect of the treatment was seen on any of the other seven quality ratings that the reviewers assigned. Reviewers acknowledged conflicts of interest as an important matter and believed that they could correct for them when they were disclosed. However, their counterfactual scores did not differ from actual scores (M_actual=2.69; M_counterfactual=2.67; difference in means 0.02, 0.01 to 0.02). When conflicts were reported, a comparison of different source types (for example, government, for-profit corporation) found no difference in effect.
    CONCLUSIONS: Current ethical standards require disclosure of conflicts of interest for all scientific reports. As currently implemented, this practice had no effect on any quality ratings of real manuscripts being evaluated for publication by real peer reviewers.
    DOI:  https://doi.org/10.1136/bmj.l5896
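    To make the mean-difference comparison reported in the trial above concrete, here is a minimal Python sketch that computes a between-group difference in mean 1-5 quality scores with a normal-approximation 95% confidence interval; the reviewer scores and group sizes below are hypothetical, not the trial's data or analysis code.

      # Minimal illustration with made-up reviewer scores; not the trial's data or code.
      import math
      import random

      random.seed(1)
      control = [random.choice([1, 2, 3, 4, 5]) for _ in range(400)]    # hypothetical 1-5 ratings
      treatment = [random.choice([1, 2, 3, 4, 5]) for _ in range(400)]  # hypothetical 1-5 ratings

      def mean(xs):
          return sum(xs) / len(xs)

      def sample_sd(xs):
          m = mean(xs)
          return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

      # Difference in means with a normal-approximation 95% confidence interval.
      diff = mean(treatment) - mean(control)
      se = math.sqrt(sample_sd(control) ** 2 / len(control) + sample_sd(treatment) ** 2 / len(treatment))
      print(f"mean difference {diff:.2f}, 95% CI {diff - 1.96 * se:.2f} to {diff + 1.96 * se:.2f}")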
  5. Paediatr Respir Rev. 2019 May 11. pii: S1526-0542(19)30040-5. [Epub ahead of print]
      Ethics has been defined as the way we ought to behave. Medical publishing essentially exists to broadcast current and new medical knowledge to aid in the practice of medicine. In this review article we consider many of the aspects of medical publishing with regard to 'what we ought to do' and, equally, 'what we ought not to do' from the perspective of various ethical frameworks. Although ethics is not the law, a set of rules, or a code of conduct, an ethical lens can be useful when developing good general guidelines for medical publishing.
    Keywords:  Equality; Gender; Peer review; Plagiarism
    DOI:  https://doi.org/10.1016/j.prrv.2019.04.005
  6. Mayo Clin Proc. 2019 Nov;94(11): 2272-2276. pii: S0025-6196(19)30762-1
      The International Committee of Medical Journal Editors requires authors to disclose all financial conflicts of interest (COI) that can be perceived as influencing the related trials. Undisclosed financial COI may influence the perception of the authors' scientific impartiality and erode public trust in the reported results. Data regarding completeness of COI disclosure in high-impact-factor general medicine journals are limited. We compared payments disclosed by US-based physicians who were first or last authors of clinical drug trials published between August 2016 and August 2018 in the New England Journal of Medicine, JAMA, and Lancet, to payments reported by industry to the Centers for Medicare & Medicaid Services Open Payments Database. Of 247 included authors, 198 (80%) had not disclosed some or all received payments. The median undisclosed sum was $8409 (US Dollars) (interquartile range [IQR] $123 to $44,890). Most authors (n=170, 69%) had received more than $10,000 per year (median $120,403, IQR $58,905 to $242,014). The median undisclosed sum for these authors was $26,530 (IQR $7462 to $71,562). Median undisclosed sums for authors of papers from studies performed with and without industry funding were $20,899 (IQR $4191 to $59,883) and $149 (IQR $0 to $3276), respectively. In 10 (8%) of 125 industry-funded trials, the first or last author had not disclosed personal payments from the study sponsor (median $9741, IQR $4508 to $101,484). These findings could raise concerns about the authors' equipoise toward the trial results and influence the public perception of the credibility of reported data. Health care professionals, reviewers, and journal editors should demand more transparent reporting of financial COI.
    DOI:  https://doi.org/10.1016/j.mayocp.2019.08.025
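    The undisclosed-payment figures in the study above are medians with interquartile ranges. As a toy illustration (the dollar amounts below are invented, not the study data), such a summary can be computed in Python as follows:

      # Toy example: summarise hypothetical undisclosed-payment sums as median (IQR).
      import statistics

      undisclosed = [0, 150, 1200, 8400, 12500, 27000, 45000, 70000, 100000]  # hypothetical US$ per author

      median = statistics.median(undisclosed)
      q1, _, q3 = statistics.quantiles(undisclosed, n=4)  # quartile cut points
      print(f"median ${median:,.0f} (IQR ${q1:,.0f} to ${q3:,.0f})")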
  7. Nature. 2019 Nov;575(7781): 32-34
      
    Keywords:  Publishing; Research data; Research management
    DOI:  https://doi.org/10.1038/d41586-019-03308-7
  8. J Med Ethics. 2019 Nov 08. pii: medethics-2019-105737. [Epub ahead of print]
     OBJECTIVE: A high prevalence of authorship problems can have a severe impact on the integrity of the research process. In 2019, we evaluated the authorship practices of clinicians at the same university hospital as in our 2003 study, to compare the two datasets and find out whether practices had changed.
    METHODS: Practitioners were randomly selected from the hospital database (Hospices Civils de Lyon, France). The telephone interviews were conducted by a single researcher (HM) using a simplified version of the interview guide used in 2003. The doctors were informed that their answers would be aggregated without the possibility of identifying individual respondents. During the interviews, the researcher recorded the answers by ticking boxes on a paper form.
    RESULTS: We interviewed 26 clinicians (mean age 49±8 years) from various medical specialties. They were unfamiliar with the ICMJE (International Committee of Medical Journal Editors) authorship criteria and felt that, in general, these criteria were not well met. With regard to ways of reducing honorary authorship, the participants clearly felt that asking for a signature was hypocritical and of little use. Ghost authorship was well known but considered rather rare. 'Publish or perish' pressure was cited by all respondents (26/26, 100%) as responsible for bad practices. Compared with the results observed in 2003, no improvement was seen over the past 15 years.
    CONCLUSION: For the second time in France, within a 15-year interval, we have shown that the ICMJE criteria were ignored and that honorary authorship was frequent.
    Keywords:  clinical ethics; professional misconduct; publication ethics; research ethics; scientific research
    DOI:  https://doi.org/10.1136/medethics-2019-105737
  9. Emerg Med J. 2019 Nov 06. pii: emermed-2019-208629. [Epub ahead of print]
     OBJECTIVE: We investigated the association of the publication of the Consolidated Standards of Reporting Trials extension for abstracts (CONSORT-EA), and of other variables of interest, with the quality of reporting of abstracts of randomised controlled trials (RCTs) published in emergency medicine (EM) journals.
    METHODS: We performed a survey of the literature, comparing the quality of reporting before (2005-2007) with after (2014-2015) the publication of the dedicated CONSORT-EA in 2008. The quality of reporting was measured as the sum of items of the CONSORT-EA checklist reported in each abstract, ranging from 0 to 15. The main explanatory variable was the period of publication: pre-CONSORT-EA versus post-CONSORT-EA publication. Other explanatory variables were the journal's endorsement of the CONSORT statement, the number of centres participating in the study, the study's sample size, type of intervention, significance of results, source of funding and study setting. We analysed the data using generalised estimating equations, performing a univariate and a multivariable analysis.
    RESULTS: We retrieved 844 articles, and randomly selected 60 per period for review, after stratifying for journal. The mean (SD) number of items reported was 6.4 (1.9) in the period before and 6.9 (1.8) in the period after the publication of the CONSORT-EA, with an adjusted mean difference (aMD) of 0.47 (95% CI -0.13 to 1.06). Abstracts of trials of pharmacological interventions had a significantly larger mean number of reported items than those of trials of non-pharmacological interventions (aMD 1.59; 95% CI 0.94 to 2.24).
    CONCLUSIONS: The quality of reporting in abstracts of RCTs published in EM journals is low and was not significantly impacted by the publication of a dedicated CONSORT-EA.
    Keywords:  methods; publication; quality; research
    DOI:  https://doi.org/10.1136/emermed-2019-208629
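    The outcome in the survey above is simply the per-abstract sum of reported checklist items (0 to 15), summarised per period. A minimal Python sketch with invented yes/no judgements (not the survey data) looks like this:

      # Score each abstract as the number of checklist items reported (0-15); data are invented.
      import statistics

      pre_consort_ea = [
          [True] * 6 + [False] * 9,   # an abstract reporting 6 of 15 items
          [True] * 7 + [False] * 8,
          [True] * 6 + [False] * 9,
      ]
      post_consort_ea = [
          [True] * 7 + [False] * 8,
          [True] * 8 + [False] * 7,
          [True] * 6 + [False] * 9,
      ]

      def summarise(abstracts):
          scores = [sum(items) for items in abstracts]  # items reported per abstract
          return statistics.mean(scores), statistics.stdev(scores)

      for label, group in (("pre-CONSORT-EA", pre_consort_ea), ("post-CONSORT-EA", post_consort_ea)):
          mean_score, sd_score = summarise(group)
          print(f"{label}: mean {mean_score:.1f} (SD {sd_score:.1f}) of 15 items")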
  10. Handb Exp Pharmacol. 2019 Nov 07.
      Scholarly publishers can help to increase data quality and reproducible research by promoting transparency and openness. Increasing transparency can be achieved by publishers in six key areas: (1) understanding researchers' problems and motivations, by conducting and responding to the findings of surveys; (2) raising awareness of issues and encouraging behavioural and cultural change, by introducing consistent journal policies on sharing research data, code and materials; (3) improving the quality and objectivity of the peer-review process by implementing reporting guidelines and checklists and using technology to identify misconduct; (4) improving scholarly communication infrastructure with journals that publish all scientifically sound research, promoting study registration, partnering with data repositories and providing services that improve data sharing and data curation; (5) increasing incentives for practising open research with data journals and software journals and implementing data citation and badges for transparency; and (6) making research communication more open and accessible, with open-access publishing options, permitting text and data mining and sharing publisher data and metadata and through industry and community collaboration. This chapter describes practical approaches being taken by publishers, in these six areas, their progress and effectiveness and the implications for researchers publishing their work.
    Keywords:  Data sharing; Open access; Open science; Peer review; Publishing; Reporting guidelines; Reproducible research; Research data; Scholarly communication
    DOI:  https://doi.org/10.1007/164_2019_290
  11. F1000Res. 2019;8: 1517
      Open access policies have been progressing since the beginning of this century. Important global initiatives, both public and private, have set the tone for what we understand by open access. The emergence of tools and web platforms for open access (both legal and illegal) has placed the focus of the discussion on open access to knowledge, both for academics and for the general public, who finance such research through their taxes, particularly in Latin America. This historically overlooked issue must, we believe, be discussed publicly, given the characteristics of the Latin American scientific community and its funding sources. This article includes an overview of what is meant by open access and describes the origins of the term, both in its philosophical sense and in its practical sense, as expressed in the global declarations of Berlin and Bethesda. It also includes the notion of open access managed (or not) by some reputable institutions in Chile, such as CONICYT (National Commission for Scientific and Technological Research) and nationally reputed higher education institutions such as the Universidad de Chile and the Pontificia Universidad Católica de Chile. Various Latin American initiatives related to open access (SciELO and Redalyc, among others) are described, as well as the presence of Chilean documents on those platforms. The national institutional repositories are listed, along with their current status, and we discuss what open access has implied in Latin America and its importance for the replicability of research carried out locally. Finally, we describe some governmental initiatives (mainly legislative) at the Latin American level and propose some recommendations regarding the promotion and implementation of repositories for the scientific data of national research (for access and replication purposes).
    Keywords:  Chile; Latin America; Open Access; Repositories; Scientific Data
    DOI:  https://doi.org/10.12688/f1000research.19976.1
  12. J Neurosurg Anesthesiol. 2019 Nov 06.
     BACKGROUND: Randomized controlled trials (RCTs) are considered to provide high levels of evidence to optimize decision-making for patient care, although there can be a risk of bias in their design, conduct, and analysis. Quality assessment of RCTs is necessary to assess whether they provide reliable results with little bias.
    MATERIALS AND METHODS: We assessed the reporting quality of RCTs published in the Journal of Neurosurgical Anesthesiology (JNA) between January 1, 2000 and December 31, 2017 using the Jadad scale, van Tulder scale, and Cochrane Collaboration Risk of Bias Tool (CCRBT).
    RESULTS: We identified 130 RCTs and 570 original articles. Among the 130 RCTs, 92 (70.8%) presented an appropriate blinding method, and 70 (53.8%) described an appropriate allocation method. For the entire period, the percentages of high-quality reporting articles were 71.5%, 73.1%, and 13.8% in the Jadad scale, van Tulder scale, and CCRBT assessments, respectively. There was an improvement in the van Tulder scale over time (coefficients [95% confidence interval {CI}]=0.08 [0.01-0.15]; P=0.02). Appropriate reporting of allocation in the Jadad scale (coefficients [95% CI]=1.68 [1.28-2.07]; P<0.001) and van Tulder scale (coefficients [95% CI]=2.34 [1.97-2.70]; P<0.001), and reporting of blinding in the Jadad (coefficients [95% CI]=1.09 [0.66-1.52]; P<0.001) and van Tulder scores (coefficients [95% CI]=1.85 [1.45-2.25]; P<0.001), were associated with high-quality reporting.
    CONCLUSIONS: The ratio of high-quality reporting RCTs in JNA was consistently high compared with other journals. Thorough consideration of allocation concealment during the peer review process can further improve the reporting quality of RCTs in JNA.
    DOI:  https://doi.org/10.1097/ANA.0000000000000662
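    As a rough illustration of one of the instruments named above, the commonly described Jadad items (randomisation, blinding, and accounting for withdrawals) can be scored as in the Python sketch below; the threshold of 3 or more for "high quality" follows common usage, the example judgements are hypothetical, and this is not the review's own scoring code.

      # Simplified Jadad scoring (0-5) as commonly described; example judgements are hypothetical.
      def jadad_score(randomised, randomisation_method,
                      double_blind, blinding_method, withdrawals_described):
          score = 0
          if randomised:
              score += 1
              if randomisation_method == "appropriate":
                  score += 1
              elif randomisation_method == "inappropriate":
                  score -= 1
          if double_blind:
              score += 1
              if blinding_method == "appropriate":
                  score += 1
              elif blinding_method == "inappropriate":
                  score -= 1
          if withdrawals_described:
              score += 1
          return score

      example = jadad_score(randomised=True, randomisation_method="appropriate",
                            double_blind=True, blinding_method="unclear",
                            withdrawals_described=True)
      print(example, "-> high quality" if example >= 3 else "-> low quality")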
  13. Eur Ann Otorhinolaryngol Head Neck Dis. 2019 Nov 04. pii: S1879-7296(19)30170-X. [Epub ahead of print]
       OBJECTIVES: To evaluate the use of P-values and the terms "significant", "non-significant" and "suggestive" in Abstracts in the European Annals of Otorhinolaryngology, Head & Neck Diseases.
    MATERIALS AND METHODS: Consecutive articles accepted for publication during the period January 2016 - February 2019 were systematically reviewed. Main goal: descriptive analysis of the citation of P-values and use of the terms "significant", "non-significant" and "suggestive" in Abstracts. Secondary goal: analytic study of: (i) correlations between citation of a P-value and the main characteristics of authors and topics; and (ii) misuse of the terms "significant", "non-significant" and "suggestive" with respect to cited P-values, and correlations with author and topic characteristics.
    RESULTS: In all, 91 articles were included. P-values and the terms "significant", "non-significant" and "suggestive" were cited in 35.1%, 41.7%, 10.9% and 0% of Abstracts, respectively. Citing a P-value did not significantly correlate with author or topic characteristics. There were discrepancies between the terms "non-significant", "significant" and "suggestive" and P-values given in the body of the article in 57.1% of Abstracts, with 30.7% overestimation and 25.2% underestimation of results, without significant correlation with author or topic characteristics.
    CONCLUSION: Authors, editors and reviewers must pay particular attention to the spin resulting from inappropriate use of the terms "significant", "non-significant" and "suggestive" in Abstracts of articles submitted to the European Annals of Otorhinolaryngology, Head & Neck Diseases, to improve the rigor, quality and value of the scientific message delivered to the reader.
    Keywords:  Medical writing; P value; Scientific report; Significant; Spin; Statistics; Suggestive
    DOI:  https://doi.org/10.1016/j.anorl.2019.10.008
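    The discrepancy check described in the study above amounts to comparing the wording used in the Abstract with the P-value reported in the body of the article at the conventional 0.05 threshold. A minimal Python sketch with invented records (not the study data):

      # Flag abstracts whose wording disagrees with the P-value in the article body; records are invented.
      records = [
          {"abstract_term": "significant", "body_p_value": 0.03},
          {"abstract_term": "significant", "body_p_value": 0.12},      # overestimation (spin)
          {"abstract_term": "non-significant", "body_p_value": 0.01},  # underestimation
      ]

      ALPHA = 0.05  # conventional significance threshold

      for r in records:
          claims_significant = r["abstract_term"] == "significant"
          is_significant = r["body_p_value"] < ALPHA
          if claims_significant and not is_significant:
              verdict = "overestimation (spin)"
          elif not claims_significant and is_significant:
              verdict = "underestimation"
          else:
              verdict = "consistent"
          print(r["abstract_term"], r["body_p_value"], "->", verdict)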
  14. Nature. 2019 Nov;575(7781): 51
      
    Keywords:  History; Publishing
    DOI:  https://doi.org/10.1038/d41586-019-03386-7
  15. Nature. 2019 Nov;575(7781): 247-248
      
    Keywords:  Computational biology and bioinformatics; Computer science; Peer review; Software; Technology
    DOI:  https://doi.org/10.1038/d41586-019-03366-x
  16. Nature. 2019 Nov;575(7781): 25-28
      
    Keywords:  Arts; Culture; History; Publishing; Research management
    DOI:  https://doi.org/10.1038/d41586-019-03306-9
  17. J Diabetes Investig. 2019 Nov 05.
      Scientists and publishers worldwide are closely following developments in the European Open Access landscape following the announcement that "top European research funders announce 'Plan S' to make all scientific works free to read" [1,2]. "No science should be locked behind paywalls": this powerful declaration was released by 11 agencies on September 4, 2018 [1]. It is difficult, as Editor-in-Chief of the Journal of Diabetes Investigation (JDI), to ignore this new challenge. Leaving that topic to be discussed later, it is worth first looking back on JDI's progress over the past ten years.
    DOI:  https://doi.org/10.1111/jdi.13174
  18. Nature. 2019 Nov;575(7781): 7-8
      
    Keywords:  History; Publishing
    DOI:  https://doi.org/10.1038/d41586-019-03304-x