🧟 Beware of the zombie trials:
More than one-quarter of a subset of manuscripts describing randomized clinical trials submitted to the journal Anaesthesia between 2017 and 2020 seemed to be faked or fatally flawed when their raw data could be examined, editor John Carlisle reported. He called these ‘zombies’. But when their raw data could not be obtained, Carlisle could label only 1% as zombies.
Good thing that science is a strong link problem, because too many of the links are just sawdust and dreams. (via Derek Lowe)
Sometimes, that small print does matter
There is predatory, and then there is predatory:
When Björn Johansson received an email in July 2020 inviting him to speak at an online debate on COVID-19 modeling, he didn’t think twice. “I was interested in the topic and I agreed to participate,” says Johansson, a medical doctor and researcher at the Karolinska Institute. “I thought it was going to be an ordinary academic seminar. It was an easy decision for me.”
…
All the scientists interviewed by Science say Ferensby’s initial messages never mentioned conference fees. When one speaker, Francesco Piazza, a physicist now at the University of Florence, directly asked Ferensby whether the organizers would request a fee, Ferensby replied, “No, we are talking about science and COVID-19.”
But after the events, the speakers were approached by a conference secretary, who asked them to sign and return a license agreement that would give Villa Europa—named in the document as the conference organizer—permission to publish the webinar recordings. Most of the contracts Science has seen state that the researcher must pay the company €790 “for webinar debate fees and open access publication required for the debate proceedings” plus €2785 “to cover editorial work.” These fees are mentioned in a long clause in the last page of the contract, and are written out in words rather than numbers, without any highlighting.
What an absolute nightmare. Predatory journals at least have the decency to ask you for the money up front.
And let’s take a moment to contemplate the ridiculousness of the current academic publishing and conference model. Note that there is nothing unusual about academic conferences requiring attendance fees from speakers. If you have a scientific abstract accepted for oral or poster presentation at ASCO, let’s say, you will still have to pony up for the registration fee. And publication fees for a legitimate open access journal can be north of $3,000. So how is a judge to know whether the organizer’s claims are legitimate?
The difference, of course, is that the good ones — both journals and conferences — don’t solicit submissions; you have to beg them to take your money. Which only makes the situation more ridiculous, not less.
The definition of cancer, with a few side notes on impact factor
A group of cancer researchers proposes an updated definition of cancer:
While reflecting past insights, current definitions have not kept pace with the understanding that the cancer cell is itself transformed and evolving. We propose a revised definition of cancer: Cancer is a disease of uncontrolled proliferation by transformed cells subject to evolution by natural selection.
I like it!
Side note: the opinion came out in Molecular Cancer Research, a journal published by a reputable organization with an impact factor of 5.2. No shame in that, but many predatory journals now have IFs that are the same or even higher (see also: Goodhart’s law, and a good example of why a metric becomes meaningless over time without context, or at least a denominator), so unless the impact factor is mid-to-high double digits, it no longer carries much information on the journal’s credibility or readership.
A side note to the side note: a paper published in a journal listed as predatory is the second-highest cited of any I co-authored: 100 citations and counting. I also think it is a very good paper, although a review article getting that many citations is a sign that too many people are not citing primary literature, which is bad! And my most highly cited paper is also a review! This is embarrassing for me, but speaks even worse of the people doing all that review-citing. But maybe a journal being listed as predatory no longer carries much information about whether its articles are worth a read?
Stop the presses: I have just found out that Derek Lowe’s Science column In the Pipeline has an RSS feed. Years ago, I tried adding it to my then-feedreader of choice, but either I was not persistent enough in my search, or it just wasn’t there back then. Either way, I’m happy.
From ignore the code, the most sensible explanation of the Monty Hall problem I have yet seen. Yes, changing your mind once more information is available is a good idea. The devil is in establishing what constitutes new information. For better or worse, life is not a game show.
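If a simulation is more convincing than prose, here is a minimal sketch of my own (not from the linked post) that plays the game many times and compares staying with switching:

```python
import random

def monty_hall(trials=100_000):
    """Simulate the Monty Hall game and compare stay vs. switch win rates."""
    stay_wins = switch_wins = 0
    for _ in range(trials):
        prize = random.randrange(3)    # door hiding the car
        choice = random.randrange(3)   # contestant's first pick
        # Host opens a door that is neither the pick nor the prize
        opened = next(d for d in range(3) if d != choice and d != prize)
        # Switching means taking the one remaining closed door
        switched = next(d for d in range(3) if d != choice and d != opened)
        stay_wins += (choice == prize)
        switch_wins += (switched == prize)
    return stay_wins / trials, switch_wins / trials

if __name__ == "__main__":
    stay, switch = monty_hall()
    print(f"stay: {stay:.3f}, switch: {switch:.3f}")  # ~0.333 vs. ~0.667
```

The host’s opened door is the new information: it is never the prize, so switching wins whenever your first pick was wrong, which is two times out of three.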
Cool resource alert: Improving Your Statistical Inferences by Daniël Lakens has the best introduction to p values I’ve seen on the free web. Frank Harrell’s Biostatistics for Biomedical Research has been available for a while, but it is only suitable for advanced readers.
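For anyone who wants to poke at p values before committing to a full course, here is a minimal simulation sketch of my own (not taken from either resource) of the point such introductions usually open with: when the null is true, p values are uniformly distributed, so only the nominal 5% fall below 0.05; when there is a real effect, they pile up near zero.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Under a true null (both groups drawn from the same distribution),
# p values are uniform: about 5% land below 0.05 by construction.
p_null = [
    stats.ttest_ind(rng.normal(0, 1, 30), rng.normal(0, 1, 30)).pvalue
    for _ in range(10_000)
]

# With a real difference between groups, small p values pile up near zero.
p_alt = [
    stats.ttest_ind(rng.normal(0, 1, 30), rng.normal(0.5, 1, 30)).pvalue
    for _ in range(10_000)
]

print(f"null:  {np.mean(np.array(p_null) < 0.05):.3f} below 0.05")  # ≈ 0.05
print(f"shift: {np.mean(np.array(p_alt) < 0.05):.3f} below 0.05")   # the test's power
```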
That feeling you get when something a long time coming finally does come out
I have always admired prolific writers like Matthew Yglesias and Scott Alexander — both now on Substack, and not by accident — for their ability to produce tens of thousands of words daily (my admiration tempered somewhat by ChatGPT and other LLMs, which are about as intellectually and factually rigorous as Alexander, and slightly less so than Yglesias; some sacrifices do have to be made in the name of productivity), on top of the random bite-sized thoughts posted on social media. There are only so many words I can read and write in a day, and for the better part of the last year my language IO has been preoccupied with helping clean, analyze, interpret, and write up the results of a single clinical trial, which are now finally out in The Lancet Neurology. Yes, my highest impact factor paper to date is in a neurology journal. Go figure.
The paper is about our clinical trial, which used the body’s own immune system to treat an autoimmune disease — and a particular one at that, myasthenia gravis — via a technology that until now has only been used against cancer (CAR T cells). It has made a decent impact since it came out less than two days ago. It got a write-up in The Economist, for one. Endpoints News as well. Evaluate Vantage got the best quote — it is at the very end of the article. And there is a whole bunch of press releases: from the National Institutes of Health, the University of North Carolina, Oregon Health & Science University, and of course Cartesian Therapeutics.
What went on yesterday reminded me that Twitter is not going anywhere any time soon: all of the above releases were to be found only there, not on a Mastodon instance; the journal’s own media metrics do not — and cannot, at least not easily — trawl the Fediverse for hits; and I can’t just type “Descartes–08”, “myasthenia gravis CAR-T”, or “Cartesian” into a Mastodon search box and get anything of relevance. One could, of course, argue that you wouldn’t get anything of relevance on Twitter either, most of the discussion consisting of people who have barely read the tweet, let alone the article. And one would be correct. But while most of the non-Web3/crypto tech world has moved out, it looks like people in most other fields, from medicine to biotechnology to the NBA commentariat, are maintaining a substantial Twitter presence.
This will, of course, have no impact on my commitment to staying out of the conversation to the extent possible while maintaining a semi-regular schedule of 500-character posts, which may now, IO bandwidth having opened up, become a tiny bit longer. Thank you for reading!
After seeing a friend and collaborator yet again plant foot firmly in mouth, I begin to see a pattern. A course on ergodicity should be a requirement for a public health degree, since the masters of public health keep getting it wrong (see also: the screening colonoscopy debate).
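For anyone unsure why ergodicity is the crux, here is a minimal sketch of the standard illustration (my example, not the colonoscopy debate itself): a multiplicative coin-flip gamble whose ensemble average grows while the typical individual trajectory shrinks, exactly the kind of averaging-over-people versus averaging-over-time confusion at issue.

```python
import numpy as np

rng = np.random.default_rng(42)

# Non-ergodic gamble: wealth is multiplied by 1.5 on heads and 0.6 on tails.
# The expected (ensemble-average) value grows 5% per round, but the
# time-average growth factor is sqrt(1.5 * 0.6) ≈ 0.95 < 1, so the typical
# individual trajectory shrinks.
n_people, n_rounds = 100_000, 50
factors = rng.choice([1.5, 0.6], size=(n_people, n_rounds))
wealth = factors.prod(axis=1)  # each person's final wealth, starting from 1

print(f"expected value after {n_rounds} rounds: {1.05 ** n_rounds:.1f}")  # ≈ 11.5
print(f"simulated mean:   {wealth.mean():.2f}")      # pulled up by a few lucky runs
print(f"simulated median: {np.median(wealth):.3f}")  # ≈ 0.07: the typical person loses
```

The average over the population looks great; the average over any one person’s timeline does not, which is why “good for the population on average” and “good for you” can point in opposite directions.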
Any text about building a better NIH will have my interest, and the Brookings Institution came up with not one such text but eight! To my bemusement, zero of the nine authors responsible have any background in biomedical science. Could that be why some of them were so good?
There is a strong belief among scientists and lay people alike that:
- more information is always better (it isn’t)
- hard numbers are the only valid form of information (they aren’t)
Combine the two fallacies and you get to today’s level of misinformation and self-deception.