bims-skolko Biomed News
on Scholarly communication
Issue of 2021-06-13
twenty papers selected by
Thomas Krichel, Open Library Society



  1. Ann Surg. 2021 Jun 02.
       OBJECTIVE: To assess the prevalence, magnitude, and disclosure status of industry funding in editorial boards of surgery journals.
    SUMMARY OF BACKGROUND DATA: Financial conflicts of interest (COI) can bias research. While authors seeking to publish in peer-reviewed surgery journals are required to provide COI disclosures, editorial board members' COIs are generally not disclosed to readers.
    METHODS: We present a cross-sectional analysis of industry funding to editorial board members of high-impact surgery journals. We reviewed the top US-based surgery journals by impact factor to determine the presence of financial COI among members of each journal's editorial board. The prevalence and magnitude of COI were determined using industry-reported payments for 2018 found in the CMS Open Payments database. Journal websites were also reviewed for editorial board disclosure statements.
    RESULTS: A total of 1,002 names of editorial board members from the top 10 high-impact American surgery journals were identified. Of 688 individual physicians based in the USA, 452 (65.7%) were found to have received industry payments in 2018, totaling $21,916,503, with a median funding amount per physician of $1,253 (IQR $156-$10,769). Funding levels varied by surgical specialty and journal. Editorial board disclosure information was found for only 3.3% of physicians.
    CONCLUSIONS: Industry funding to editorial board members of high-impact surgery journals is prevalent and underreported. Mechanisms of COI disclosure are needed at the editorial board level to provide readers with full transparency. Such disclosure would acknowledge editorial board members' COIs and could further reduce the risk of bias in editorial decisions.
    DOI:  https://doi.org/10.1097/SLA.0000000000004929
  2. J Anaesthesiol Clin Pharmacol. 2021 Jan-Mar; 37(1): 57-62
     Background and Aims: Publishing a scientific article in a reputable journal is an uphill task that demands a significant amount of time and effort from the authors and the editorial team. Prospective researchers are keen to know whether the rapidly growing publication load during this pandemic has changed journals' peer review or publication processes. We aimed to compare the peer review speed of anesthesiology journal articles published during the pandemic (2020) with that of the previous year and to analyze the factors affecting peer review speed.
    Material and Methods: Sixteen anesthesiology journals indexed in the MEDLINE database were retrospectively analyzed. From each included journal, a set of 24 articles published in 2019 was selected as the control and a set of 12 articles published between January and September 2020 was selected for comparison. The time from submission to acceptance and to publication was noted. Peer review time was calculated, and its relationship with the h-index, the continent of journal origin, and article processing charges was evaluated.
    Results: The median peer review times in 2019 and 2020 were 116 (108-125) days and 79 (65-105.5) days, respectively. There was a 31.8% decrease (P = 0.0021) in the peer review time of all articles in 2020 compared with 2019. The median peer review time of COVID-19 articles was 35 (22-42.5) days, a 55.6% decrease compared with non-COVID-19 articles in 2020 (a worked check of these percentages follows this entry). There was a significant correlation between peer review time and h-index (r = 0.558, P = 0.024). There was no significant difference in peer review time between journals with and without an article processing charge (P = 0.75) or between journals from different continents (P = 0.56).
    Conclusion: Anesthesiology journals managed to curtail their peer review turnaround time during the pandemic compared with the previous year. Journals with a higher h-index had longer peer review times. The presence of an article processing charge and the continent of the publishing journal had no impact on peer review speed.
    Keywords:  Anaesthesiology; COVID-19; h-index; peer review time
    DOI:  https://doi.org/10.4103/joacp.JOACP_652_20
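    The percentage decreases reported above follow directly from the quoted medians; the short Python sketch below reproduces that arithmetic. The variable names are mine and the exact rounding used by the authors is an assumption, so the printed values may differ slightly from the published figures.

      # Worked check of the percentage decreases quoted in the Results above.
      # Median values are taken from the abstract; rounding conventions are assumed.

      median_2019 = 116       # days, all articles, 2019
      median_2020 = 79        # days, all articles, 2020
      median_covid_2020 = 35  # days, COVID-19 articles, 2020

      decrease_all = (median_2019 - median_2020) / median_2019 * 100
      decrease_covid = (median_2020 - median_covid_2020) / median_2020 * 100

      print(f"All articles, 2019 -> 2020: {decrease_all:.1f}% decrease")          # ~31.9% (reported: 31.8%)
      print(f"Non-COVID -> COVID articles, 2020: {decrease_covid:.1f}% decrease") # ~55.7% (reported: 55.6%)

    The small discrepancies presumably reflect the authors computing the percentages from unrounded underlying data rather than from the published medians.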
  3. J Clin Psychol Med Settings. 2021 Jun 08.
      Founded in 1994, The Journal of Clinical Psychology in Medical Settings (JCPMS) has paralleled the development of psychology's role in health care while also contributing to its growth in science, services, and education in medical settings. JCPMS provides an essential, unique publishing outlet for health service psychology as represented by the recognized psychological specialties in those settings. At this point in its development, The Journal has turned its attention to generativity, contributing further to the field by helping prepare the next generation of journal manuscript reviewers and future psychological scientists. A brief developmental history of The Journal and its relationship to the evolution of health service psychology is offered, followed by a description of a task-specific mentoring process for a new generation of manuscript reviewers. Building on work by other authors, a competency-based model is used to rearrange previously published guidance into categories of knowledge, skills, and attitudes required to become a competent manuscript reviewer. General competencies are described within each of those categories, as are specific behavioral anchors that a mentee must master in order to carry out a competent review.
    Keywords:  Competency-based education; Health service psychology; Manuscript review; Mentoring
    DOI:  https://doi.org/10.1007/s10880-021-09795-z
  4. BMC Med Res Methodol. 2021 Jun 08. 21(1): 120
     BACKGROUND: Pandemic events often trigger a surge of clinical trial activity aimed at rapidly evaluating therapeutic or preventative interventions. Ensuring rapid public access to the complete and unbiased trial record is particularly critical for pandemic research, given the urgent associated public health needs. The World Health Organization (WHO) established standards requiring posting of results to a registry within 12 months of trial completion and publication in a peer-reviewed journal within 24 months of completion, though compliance with these requirements among pandemic trials is unknown.
    METHODS: This cross-sectional analysis characterizes the availability of results in trial registries and publications among registered trials performed during the 2009 H1N1 influenza, 2014 Ebola, and 2016 Zika pandemics. We searched trial registries to identify clinical trials testing interventions related to these pandemics and determined the time elapsed between trial completion and availability of results in the registry. We also performed a comprehensive search of MEDLINE via PubMed, Google Scholar, and EMBASE to identify corresponding peer-reviewed publications. The primary outcome was compliance with either of the WHO's established standards for sharing clinical trial results (a sketch of this outcome logic follows this entry). Secondary outcomes included compliance with both standards and the time elapsed between trial completion and public availability of results.
    RESULTS: Three hundred thirty-three trials met eligibility criteria, including 261 H1N1 influenza trials, 60 Ebola trials, and 12 Zika trials. Of these, 139 (42%) either had results available in the trial registry within 12 months of study completion or had results available in a peer-reviewed publication within 24 months. Five trials (2%) met both standards. No results were available in either a registry or publication for 59 trials (18%). Among trials with registered results, a median of 42 months (IQR 16-76 months) elapsed between trial completion and results posting. For published trials, the median elapsed time between completion and publication was 21 months (IQR 9-34 months). Results were available within 24 months of study completion in either the trial registry or a peer-reviewed publication for 166 trials (50%).
    CONCLUSIONS: Very few trials performed during prior pandemic events met established standards for the timely public dissemination of trial results.
    Keywords:  Clinicaltrials.gov; Ebola; H1N1; Non-publication; Pandemic; Publication bias; Trial registration; Zika
    DOI:  https://doi.org/10.1186/s12874-021-01324-8
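    The primary outcome above is a compound condition: results posted to a registry within 12 months of completion, or a peer-reviewed publication within 24 months. A minimal Python sketch of that logic is given below; the function name, date fields, and month approximation are mine for illustration, not the authors' implementation.

      # Hypothetical sketch of the WHO-standard outcome logic described in the
      # abstract. Field names, the helper function, and the 30.44-day month
      # approximation are assumptions for illustration only.
      from datetime import date
      from typing import Optional

      def months_between(start: date, end: date) -> float:
          """Approximate number of months elapsed between two dates."""
          return (end - start).days / 30.44

      def meets_who_standard(completion: date,
                             registry_results: Optional[date],
                             publication: Optional[date]) -> bool:
          posted_in_time = (registry_results is not None
                            and months_between(completion, registry_results) <= 12)
          published_in_time = (publication is not None
                               and months_between(completion, publication) <= 24)
          return posted_in_time or published_in_time

      # Example: trial completed Jan 2015, registry results posted Oct 2015,
      # never published -> compliant via the registry-posting standard.
      print(meets_who_standard(date(2015, 1, 15), date(2015, 10, 1), None))  # True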
  5. Trends Cogn Sci. 2021 Jun 02. pii: S1364-6613(21)00124-8. [Epub ahead of print]
      Peer review is an integral part of scientific life, determining success in publishing, grant applications, and professional appointments. We argue for the importance of neutral language in peer review and provide examples of nonneutral linguistic and stylistic devices that emphasize a reviewer's personal response to the manuscript rather than their objective assessment.
    Keywords:  bias; dissemination; funding; publishing; scientific communication
    DOI:  https://doi.org/10.1016/j.tics.2021.05.003
  6. Indian J Cancer. 2021 Apr-Jun; 58(2): 165-170
     Background: The editors of the Indian Journal of Cancer (IJC) have not, so far, objectively analyzed the journal's editorial processes using author, referee, and editor data. Hence, we aimed to do so in this audit.
    Methods: We retrospectively analyzed manuscripts submitted to the IJC from April 1, 2020, to May 31, 2020, for data related to the peer-review process. Microsoft Excel was used to enter the retrieved information and to carry out the statistical analysis.
    Results: Three hundred and nineteen manuscripts were submitted during the study period. Of these, three were excluded from the study. Of the 316, 79 (25%) were articles on laboratory medicine and 182 (57.6%) were original articles. About half of the submitted manuscripts (166, 52.5%) were desk-rejected. Of the remaining 149 manuscripts, 105 did not follow the instructions to contributors (ITC) and required a median of two revisions (range = 1-5) to satisfy the ITC. To review 107 manuscripts, 536 external referees were invited; of them, 306 did not respond, 79 declined the invitation, and 151 accepted the invitation. Of these 151, 132 reverted with comments. Of the 200 Indians who were invited as referees, 118 (59%) accepted the invitation, whereas of the 336 non-Indian referees, only 33 (9.8%) did. Of the 107 Indian and 25 non-Indian referees who sent their comments, 86 (80.4%) and 19 (88%), respectively, offered useful comments. The median time to decision was 1 day (range = 0-42) for desk-rejection, 67 days (range = 4-309) for rejection after peer review, and 133.5 days (range = 42-305) for acceptance. A decision had not yet been taken for 14 manuscripts.
    Conclusion: The study provides evidence that it is difficult to recruit referees and that a substantial number of authors do not read or follow the ITC. We suggest that the time taken to reach a decision could be appreciably reduced if these issues were addressed.
    Keywords:  Acceptance rate; editorial process; instructions to contributors
    DOI:  https://doi.org/10.4103/ijc.IJC_1319_20
  7. Syst Rev. 2021 Jun 11. 10(1): 175
     BACKGROUND: Systematic reviews appraise and synthesize the results from a body of literature. In healthcare, systematic reviews are also used to develop clinical practice guidelines. An increasingly common concern is that systematic reviews may unknowingly capture studies published in "predatory" journals and that these studies will be included in summary estimates and affect results, guidelines, and, ultimately, clinical care.
    FINDINGS: There is currently no agreed-upon guidance on how best to manage articles from predatory journals that meet the inclusion criteria of a systematic review. We describe a set of actions that authors of systematic reviews can consider when handling articles published in predatory journals: (1) detail methods for addressing predatory journal articles a priori in a study protocol, (2) determine whether included studies are published in open access journals and whether they are listed in the Directory of Open Access Journals, and (3) conduct a sensitivity analysis with predatory papers excluded from the synthesis.
    CONCLUSION: Encountering eligible articles published in presumed predatory journals when conducting a review is an increasingly common threat. Developing appropriate methods to account for eligible research published in predatory journals is needed to decrease the potential negative impact of predatory journals on healthcare.
    Keywords:  Meta-analysis; Open access; Predatory journals; Systematic reviews
    DOI:  https://doi.org/10.1186/s13643-021-01733-2
  8. Sci Eng Ethics. 2021 Jun 07. 27(3): 39
      One of the core problems of scientific research authorship is honorary authorship, which violates the ethical principle of clear and appropriate assignment of scientific research contributions. The prevalence of honorary authorship worldwide is alarmingly high across various research disciplines. As a result, many academic institutions and publishers have been trying to find ways to overcome this unethical research practice. The International Committee of Medical Journal Editors (ICMJE) has recommended criteria for authorship as guidance for researchers submitting manuscripts to biomedical journals. However, despite the ICMJE guidelines, honorary authorship is still significantly present across various health research disciplines. The aim of this study was to explore the perceptions and knowledge of health care researchers regarding honorary authorship according to the ICMJE guidelines across different health care fields in Jordan, which, to our knowledge, has not been explored before. Data from an electronic survey distributed among researchers working in different health care fields across several major universities in Jordan revealed that most of the respondents were assistant professors working mainly in the schools of Medicine and Pharmacy. The majority of the respondents (65.5%) were not aware of the ICMJE authorship guidelines, and around 37% reported the inclusion of an honorary author; the most common non-authorship task, reported by 73% of the respondents, was reviewing the manuscript. Our findings emphasize the need for national academic and research institutions to address the issue of authorship in their educational programs and internal policies.
    Keywords:  Biomedical research; Honorary authorship; ICMJE guidelines; Online survey; Research misconduct
    DOI:  https://doi.org/10.1007/s11948-021-00317-6
  9. BMC Med Res Methodol. 2021 Jun 06. 21(1): 119
       BACKGROUND: Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case-control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.
    METHODS: Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers on 3 repositories (CSDR, YODA project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mention of policy sources, media attention) and the total number of citations were compared between these two groups.
    RESULTS: 89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses, and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3-22.2) for re-uses and 2.8 (0.3-12.3) for controls (p = 0.14). No statistically significant difference was found in any of the components of the Altmetric Attention Score (a sketch of such a matched comparison follows this entry). The median (interquartile range) numbers of citations were 3 (1-8) for re-uses and 4 (1-11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.
    CONCLUSIONS: Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, the matching choices have some limitations, so results should be interpreted very cautiously. Citations of re-uses by policy sources were also rare.
    TRIAL REGISTRATION: osf.io/fp62e.
    Keywords:  Altmetric; Attention score; Clinical trial; Data reuse; Data-sharing; Individual Participant Data; Reproducibility; Research impact; Scientific transparency
    DOI:  https://doi.org/10.1186/s12874-021-01311-z
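    The abstract does not name the statistical test used for the matched comparison of Altmetric Attention Scores; the sketch below shows one plausible approach, a Wilcoxon signed-rank test on matched case-control pairs. Both the scores and the choice of test are assumptions for illustration, not the authors' documented method.

      # Hypothetical matched comparison of Altmetric Attention Scores between
      # re-uses of RCT data (cases) and papers from the same journals (controls).
      # The data and the choice of test are assumptions for illustration only.
      from scipy.stats import wilcoxon

      reuse_scores = [5.9, 22.2, 1.3, 40.0, 8.5]    # invented case scores
      control_scores = [2.8, 12.3, 0.3, 35.0, 9.1]  # invented matched control scores

      stat, p_value = wilcoxon(reuse_scores, control_scores)
      print(f"Wilcoxon statistic = {stat}, p = {p_value:.3f}")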
  10. J Korean Med Sci. 2021 Jun 07. 36(22): e162
      Scholarly journals are hubs of hypotheses, evidence-based data, and practice recommendations that shape health research and practice worldwide. The advancement of science and information technologies has made online accessibility a basic requirement, paving the way for the advent of open access publishing and, more recently, for web-based health journalism. Especially in the time of the current pandemic, health professionals have turned to the internet, and primarily to social media, as a source of rapid information transfer and international communication. Hence, the current pandemic has ushered in an era of digital transformation of science, and we attempt to understand and assess the impact of this digitization on modern health journalism.
    Keywords:  Medical Journalism; Open Access Publishing; Social Media
    DOI:  https://doi.org/10.3346/jkms.2021.36.e162
  11. BMJ Open Sci. 2020 Jul 20. 4(1): e100115
      Reproducible science requires transparent reporting. The ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments) were originally developed in 2010 to improve the reporting of animal research. They consist of a checklist of information to include in publications describing in vivo experiments to enable others to scrutinise the work adequately, evaluate its methodological rigour and reproduce the methods and results. Despite considerable levels of endorsement by funders and journals over the years, adherence to the guidelines has been inconsistent, and the anticipated improvements in the quality of reporting in animal research publications have not been achieved. Here, we introduce ARRIVE 2.0. The guidelines have been updated and information reorganised to facilitate their use in practice. We used a Delphi exercise to prioritise and divide the items of the guidelines into two sets, the 'ARRIVE Essential 10', which constitutes the minimum requirement, and the 'Recommended Set', which describes the research context. This division facilitates improved reporting of animal research by supporting a stepwise approach to implementation. This helps journal editors and reviewers verify that the most important items are being reported in manuscripts. We have also developed the accompanying Explanation and Elaboration document, which serves (1) to explain the rationale behind each item in the guidelines, (2) to clarify key concepts and (3) to provide illustrative examples. We aim, through these changes, to help ensure that researchers, reviewers and journal editors are better equipped to improve the rigour and transparency of the scientific process and thus reproducibility.
    DOI:  https://doi.org/10.1136/bmjos-2020-100115
  12. Nature. 2020 Jun 09.
      
    Keywords:  Computer science; Databases; Publishing; SARS-CoV-2
    DOI:  https://doi.org/10.1038/d41586-020-01733-7
  13. Sci Context. 2020 Sep;33(3): 273-297
      In recent decades, many changes have occurred in scientific publishing, including online publication, data repositories, file formats, and standards. The role played by computers in this process has rekindled the debate on forms of technical determinism. This paper addresses this old debate by exploring the case of publishing processes in prehistoric archaeology during the second half of the twentieth century, prior to the wide-scale adoption of computers. It investigates a collective and international attempt to standardize the typological analysis of prehistoric lithic objects, coined typologie analytique by Georges Laplace and developed by a group of French, Italian, and Spanish researchers. The aims of this paper are to: 1) present a general bibliometric scenario of prehistoric archaeology publishing in continental Europe; 2) report on the little-known typologie analytique method in archaeology, using publications, archives, and interviews; 3) show how the publication of scientific production was shaped by social (editorial policies, support networks) and material (typography features and publication formats) constraints; and 4) highlight how actors found resources to control and counterbalance these effects, namely by changing and improving publishing formats.
    Keywords:  prehistoric archaeology; print history; scientific publishing; standardization; typography; typology
    DOI:  https://doi.org/10.1017/S0269889721000053
  14. J Clin Epidemiol. 2021 Jun 03. pii: S0895-4356(21)00174-8. [Epub ahead of print]
     BACKGROUND: Systematic reviews (SRs) are useful tools for synthesising the available evidence, but high numbers of overlapping SRs are also discussed in the context of research waste. Although it is often claimed that the number of SRs being published is increasing steadily, there are no precise data to support this. We aimed to assess trends in the epidemiology and reporting of published SRs over the last 20 years.
    METHODS: A retrospective observational study was conducted to identify potentially eligible SRs indexed in PubMed from 2000 to 2019. From all 572,871 records retrieved, we drew a simple random sample of 4,000. The PRISMA-P definition of SRs was applied to full texts and only SRs published in English were included. Characteristics were extracted by one reviewer, with a 20% sample verified by a second person.
    RESULTS: A total of 1,132 SRs published in 710 different journals were included. The estimated number of SRs indexed was 1,432 (95% CI: 547-2,317) in 2000, 5,013 (95% CI: 3,375-6,650) in 2010, and 29,073 (95% CI: 25,445-32,702) in 2019 (a sketch of the underlying sample-to-population scaling follows this entry). Transparent reporting of key items increased over the years. About 7 out of 10 named their article an SR (2000-2004: 41.9%; 2015-2019: 74.4%). In 2000-2004, 32.3% of SRs were based in the UK (0% in China); in 2015-2019, 24.0% were from China and 10.8% from the UK. Nearly all articles from China (94.9%) conducted a meta-analysis (overall: 58.9%). Cochrane reviews (n=84; 7.4%) less often imposed language restrictions, but often did not report the number of records and full texts screened and did not name their article an SR (22.6% vs. 73.4%).
    CONCLUSIONS: We observed a more than 20-fold increase in the number of SRs indexed over the last 20 years; in 2019, this is equivalent to about 80 SRs per day. Over time, SRs became more diverse with respect to journals, type of review, and country of corresponding author. The high proportion of meta-analyses from China needs further investigation.
    STUDY REGISTRATION: Open Science Framework (https://osf.io/pxjrv/).
    Keywords:  Cochrane Review; Evidence-Based Practice; Meta-Analysis; Reporting; Systematic Review; Trends
    DOI:  https://doi.org/10.1016/j.jclinepi.2021.05.022
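    The yearly estimates above scale counts from a simple random sample of 4,000 records up to the 572,871 records retrieved. The Python sketch below illustrates that scaling with a normal-approximation confidence interval; the per-year sample count is backed out from the published 2019 estimate and the interval method is an assumption, so the CI will not exactly match the published one.

      # Minimal sketch of scaling a sample count to a population estimate with a
      # normal-approximation 95% CI. N and n come from the abstract; the per-year
      # sample count and the CI method are assumptions for illustration.
      import math

      N = 572_871   # PubMed records retrieved, 2000-2019
      n = 4_000     # simple random sample screened
      k_2019 = 203  # assumed number of sampled records that were SRs indexed in 2019

      p_hat = k_2019 / n                        # sample proportion
      estimate = p_hat * N                      # scaled-up yearly estimate (~29,073)
      se = math.sqrt(p_hat * (1 - p_hat) / n)   # standard error of the proportion
      ci_low, ci_high = (p_hat - 1.96 * se) * N, (p_hat + 1.96 * se) * N

      print(f"Estimated SRs indexed in 2019: {estimate:.0f} (95% CI {ci_low:.0f}-{ci_high:.0f})")
      print(f"Equivalent to {estimate / 365:.0f} SRs per day")  # ~80, as in the conclusion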