By Thomas Dworetzky, Contributing Reporter | November 27, 2018
Stanford scientists have reported new success making diagnoses of 14 different conditions with their artificial intelligence algorithm, CheXNeXt.
The algorithm scans chest X-rays at high speed, checking for all the pathologies at the same time, with results that were nearly as good as those of human radiologists: in 10 diseases it was their equal, in three it did less well, and in one condition it did better, researchers reported in the journal PLOS Medicine.
“Usually, we see AI algorithms that can detect a brain hemorrhage or a wrist fracture – a very narrow scope for single-use cases,” Dr. Matthew Lungren, assistant professor of radiology, said in a Stanford report of the research. “But here we’re talking about 14 different pathologies analyzed simultaneously, and it’s all through one algorithm.”
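In machine-learning terms, the “one algorithm, many findings” design Lungren describes is multi-label classification: the network produces an independent probability for each pathology rather than picking a single diagnosis, so one X-ray can be flagged for several conditions at once. A minimal sketch of that output stage follows — the pathology names shown, the logit values, and the 0.5 threshold are illustrative assumptions, not details from the paper:

```python
import math

# Illustrative subset of the 14 pathologies (names assumed for the example)
PATHOLOGIES = ["atelectasis", "cardiomegaly", "effusion", "pneumonia"]

def sigmoid(x: float) -> float:
    """Map a raw network score (logit) to an independent probability."""
    return 1.0 / (1.0 + math.exp(-x))

def predict(logits, threshold=0.5):
    """One sigmoid per pathology: each finding is scored independently,
    so multiple conditions can be flagged on the same image."""
    probs = {name: sigmoid(z) for name, z in zip(PATHOLOGIES, logits)}
    return {name: p for name, p in probs.items() if p >= threshold}

# Hypothetical logits from a backbone CNN for one chest X-ray
flagged = predict([2.0, -1.5, 0.3, 1.1])
```

Here `flagged` would contain every pathology whose probability clears the threshold — three of the four in this made-up case — which is what distinguishes this design from single-use detectors that answer only one question per image.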
Chest X-rays are “critical for the detection of thoracic diseases, including tuberculosis and lung cancer, which affect millions of people worldwide each year,” stated the researchers, in their paper, observing that “this time-consuming task typically requires expert radiologists to read the images, leading to fatigue-based diagnostic error and lack of diagnostic expertise in areas of the world where radiologists are not available.”
The challenge is huge. Citing the World Health Organization's estimate that more than 4 billion people lack access to medical imaging expertise, the researchers point out that even in countries with advanced healthcare, automated chest X-ray interpretations “could be used for work-list prioritization, allowing the sickest patients to receive quicker diagnoses and treatment, even in hospital settings in which radiologists are not immediately available.”
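The work-list prioritization the researchers describe amounts to reordering a reading queue by the model's confidence that something is wrong, so the likeliest-abnormal studies reach a radiologist first. A simple sketch of that idea — the study IDs and scores here are hypothetical, and real triage systems weigh far more than a single number:

```python
# Hypothetical queue: (study_id, highest finding probability from the model)
studies = [("xr-001", 0.12), ("xr-002", 0.91), ("xr-003", 0.48)]

def prioritize(scored):
    """Reorder the reading worklist so studies the model scores as most
    likely abnormal are read first."""
    return sorted(scored, key=lambda s: s[1], reverse=True)

worklist = prioritize(studies)
```

Under this scheme, a study the algorithm flags with high confidence jumps to the front of the queue even when no radiologist is immediately on hand to triage it manually.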
The idea driving the present CheXNeXt efforts, said Lungren, is that, eventually, such a system could provide quality diagnostic support or “consultations” to healthcare providers without the interpretive expertise of a radiologist.
“We’re seeking opportunities to get our algorithm trained and validated in a variety of settings to explore both its strengths and blind spots,” graduate student Pranav Rajpurkar noted in the Stanford report. “The algorithm has evaluated over 100,000 X-rays so far, but now we want to know how well it would do if we showed it a million X-rays – and not just from one hospital, but from hospitals around the world.”
Lungren and co-senior author Andrew Ng, adjunct professor of Computer Science at Stanford, have been developing their diagnostic algorithm for over a year – the present iteration is built on earlier versions that beat radiologists when diagnosing pneumonia on X-ray.