Monday, June 29, 2020

New journal to vet Covid preprints, tag misinformation, highlight credible data

The wild, wild west of Covid-19 preprints is about to get a new sheriff. On Monday, the MIT Press is announcing the launch of an open access journal that will publish reviews of preprints related to Covid-19, in an effort to quickly and authoritatively call out misinformation as well as highlight important, credible research.
“Preprints have been a tremendous boon for scientific communication, but they come with some dangers, as we’ve seen with some that have been based on faulty methods,” said Nick Lindsay, director of journals at the MIT Press, which will publish Rapid Reviews: Covid-19. “We want to debunk research that’s poor and elevate research that’s good.”
The Covid-19 pandemic has produced a fire hose of preprints (papers posted to servers such as bioRxiv and medRxiv without peer review), many of questionable validity. The poster child for that is a bioRxiv preprint that suggested the new coronavirus had somehow been engineered from HIV; it was quickly withdrawn. But many other preprints, while not clearly wrong, used weak methodology, such as small numbers of patients or inadequate controls, as in an experiment concluding that a commercially available immunoglobulin might protect against the disease.
“There have definitely been some crummy preprints,” said Richard Sever, a co-founder of bioRxiv, launched in 2013, and of medRxiv, whose methodical rollout a year ago accelerated to warp speed with the pandemic; the newer server has now posted some 5,000 papers on Covid-19. “Quite a lot of people are talking about doing something like [the MIT Press effort]. Their challenge is getting people to do the reviews quickly. It’s a great idea but might be easier said than done.”
The editor-in-chief of Rapid Reviews: Covid-19, Stefano Bertozzi of the University of California, Berkeley, thinks this project has a secret sauce that similar efforts do not. It will use an artificial intelligence system developed at Lawrence Berkeley National Laboratory to categorize new preprints by discipline (such as epidemiology or clinical care) and degree of novelty.
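The article does not describe how the Lawrence Berkeley system works internally. Purely as an illustration of that kind of triage, the short sketch below tags a preprint abstract with a rough discipline and flags attention-grabbing claims; every category, keyword list, and function name here is a hypothetical stand-in, not drawn from the actual tool.

# Illustrative sketch only: crude keyword-based triage of a preprint abstract.
# The Lawrence Berkeley system is not detailed in this article, so the
# disciplines, keywords, and scoring below are invented for illustration.

DISCIPLINE_KEYWORDS = {
    "epidemiology": ["transmission", "reproduction number", "incidence", "cohort"],
    "clinical care": ["patients", "randomized", "treatment", "mortality"],
    "virology": ["spike protein", "receptor", "genome", "mutation"],
}

# Claims that tend to attract media attention and so may merit a fast review.
ATTENTION_PHRASES = ["cure", "engineered", "protects against", "breakthrough"]

def triage(abstract: str) -> dict:
    """Assign a rough discipline label and flag abstracts that may need rapid review."""
    text = abstract.lower()
    scores = {
        discipline: sum(1 for kw in keywords if kw in text)
        for discipline, keywords in DISCIPLINE_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return {
        "discipline": best if scores[best] > 0 else "uncategorized",
        "needs_review": any(phrase in text for phrase in ATTENTION_PHRASES),
    }

print(triage("A retrospective cohort of 40 patients suggests the immunoglobulin protects against severe disease."))

The journal’s actual tool presumably relies on far richer machine-learned features, but even this toy version shows how a daily stream of preprints could be bucketed by field and flagged for human reviewers.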
“There is such a huge volume of material every day, our goal is to do rapid reviews on preprints that are most interesting,” Bertozzi said. “Interesting” means studies that might influence public health officials, clinicians, and the public, he said, “as well as those that need to be validated or debunked, especially if they’re getting a lot of attention in the media or social media.”
That attention can come almost instantly, posing a challenge for a journal with “rapid” in its name. The AI sifter should speed up the process at the front end. Humans will also weigh in, with about 100 volunteer graduate students from around the world scanning preprints for those that most need review.
Once a preprint has been flagged, Bertozzi and his editors will ask up to three experts to review its strengths and limitations; reviewers can choose whether to have their names attached to the review.
Both medRxiv and bioRxiv would “absolutely” indicate whether a preprint has been given a Rapid Review, Sever said, just as they do when a preprint is published in a journal. “One of our missions is to alert readers to relevant conversations,” he said.
The first reviews should be up by mid-July, with an aim of posting them within seven to 10 days of a preprint appearing.
The closest similar effort went live in April at Johns Hopkins University, where epidemiologist Emily Gurley and pathologist Kate Grabowski launched the 2019 Novel Coronavirus Research Compendium. Its 50 volunteers, mostly from Hopkins, include experts in mathematical modeling, diagnostics, vaccines, and related fields. Using keyword searches, they select new studies, both preprints and those published in journals, that they think contain important information for clinicians and policymakers. Postdoctoral fellows and graduate students summarize each paper's findings and flag its strengths and limitations. Two Hopkins faculty members vet and edit the reviews, which recently passed 220.
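The Hopkins team's keyword searches are not specified in the article. As a rough, hypothetical sketch of that selection step, with made-up paper records and search terms, a filter over a day's mixed feed of preprints and journal articles might look like this:

# Hypothetical sketch of keyword-based selection from a daily feed of papers.
# The feed entries and search terms are invented for illustration.

import re

SEARCH_TERMS = ["vaccine", "diagnostic", "transmission", "mathematical model"]

feed = [
    {"title": "Household transmission of SARS-CoV-2 in a rural cohort", "source": "medRxiv"},
    {"title": "A deep-learning diagnostic for chest CT", "source": "journal"},
    {"title": "Commentary on pandemic preparedness funding", "source": "journal"},
]

def select_for_review(papers, terms):
    """Keep papers whose titles match any search term, ranked by number of hits."""
    picks = []
    for paper in papers:
        hits = [t for t in terms if re.search(t, paper["title"], re.IGNORECASE)]
        if hits:
            picks.append({**paper, "matched": hits})
    return sorted(picks, key=lambda p: len(p["matched"]), reverse=True)

for pick in select_for_review(feed, SEARCH_TERMS):
    print(pick["source"], "|", pick["title"], "| matched:", pick["matched"])

In practice a search like this would run over abstracts as well as titles; the selected papers would then go to the postdocs and graduate students for summaries, with faculty vetting before posting.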
“Our objective is to be sure folks, especially clinicians on the frontlines, can find the information they need,” Gurley said. “No one has time to go through all the papers that are coming out” on Covid-19.
Rapid Reviews: Covid-19 plans to tap a pool of 1,600 potential reviewers from hundreds of institutions, and will analyze papers on the economics and anthropology of the pandemic as well as biomedicine. If it lives up to its founders’ hopes, it would be the largest formal effort to ride herd on Covid-19 preprints. It also plans to publish original research from areas of the world that have been underrepresented in Western journals.
Praising the Hopkins project, Lindsay said, “The reality is, there’s so much Covid-19 research out there, there’s going to be plenty for all of us to examine.”
