by John R. Fischer, Senior Reporter | October 20, 2020
They reason that researchers are more focused on publishing their findings than on spending the time and resources needed to ensure those findings can be replicated. They also say that the researchers who develop AI solutions dictate the terms of data sharing and draft patients' informed consent forms, raising questions about how well informed patients actually are about how their data will be shared.
“The concern stated about privacy attacks against the learned parameters of a deep learning model could not reveal more than what went into the model, which is a mammogram, and whether radiologists identified that image as containing cancer cells,” said Waldron. “It takes a lot of twisting of the imagination to come up with any scenario where that could affect any patient volunteer.”
He and his colleagues add that third-party validation is essential to ensuring AI solutions are assessed in an unbiased manner. This can be achieved, they say, through containerization (bundling an application together with all of its related configuration files and dependencies), through platforms built specifically for sharing AI systems, and through cloud platforms with authentication.
The critique was published as an opinion piece, also in Nature.