Photography by Steven Bridges & Lars Berg
Illustration by Joseph Schmidt-Klingenberg
The clinical application—housed within the syngo®.via for Molecular Imaging reading solution—sits at the interface of medicine and artificial intelligence, and opens the door to a future in which care is delivered more efficiently and precisely. The power of Auto ID lies in its potential to dramatically speed workflow for physicians by automatically segmenting and classifying the uptake of 18F-FDG in whole-body PET/CT images as either pathologic or physiologic. Auto ID also enables physicians to calculate whole-body metabolic tumor volume (MTV) and total lesion glycolysis (TLG) within seconds.
“For me, one of the motivations is the potential impact of this type of technology,” says Ludovic Sibille, MS, the senior scientist at Siemens Healthineers who developed the algorithm that powers Auto ID. “It has a wide application and a large impact potentially on our customers and on radiology.”
A collaborative journey in nuclear medicine
“There’s this ‘aha’ moment when they would open up their eyes and say, ‘You just identified uptake that would need to be included or excluded, with a little help from me, in less than two minutes,’” he recalls, adding that the current iteration of Auto ID has brought the time down to approximately 10 seconds. “They told us pretty directly that Auto ID will be something that they could use every day.”
Artificial intelligence in nuclear medicine grounded in clinical expertise
Through an iterative process, Sibille worked to refine the parameters that make up the algorithm. He trained, validated, and tested the algorithm against data grounded in the clinical expertise of Michael Schäfers, MD, and his colleagues at the University of Münster. To create a reference standard for training, validating, and testing the algorithm, a board-certified nuclear medicine physician and a board-certified radiologist from the University of Münster analyzed whole-body PET/CT scans of 629 patients with lung cancer or lymphoma. The experts manually delineated foci with increased 18F-FDG uptake and performed more than 12,000 annotations. Like Sibille, Schäfers says he was motivated by the opportunity to have a meaningful impact on the future of medicine.
“There’s hype about artificial intelligence, but most of the approaches suffer from poor data quality or not enough data,” Schäfers says. “If you want to use AI as an expert system, to train people, to support people in their decisions, you have to make sure that the data, the ground truth, is not wrong from the beginning.”
Additional retrospective studies have continued to evaluate Auto ID’s ability to quantify MTV in lymphoma, breast cancer, and other cancer types. These studies reinforce the workflow advantages of Auto ID compared with manual segmentation and quantification. Auto ID’s ability to assist in segmenting and quantifying MTV and TLG underscores their prognostic value in predicting overall and progression-free survival.2-4
“The future of our field”
Seifert and Schäfers point out several additional ways in which artificial intelligence has the potential to enhance clinical care and basic research, including enabling more nuanced staging of cancer, the analysis of dynamic images, and the integration of imaging data with population-level datasets to advance research. “The Auto ID functionality and similar approaches are the future of our field,” Seifert says.
“AI, if it’s truly meaningful, needs to be almost invisible,” von Gall says. “Don’t change the reader’s method—support it, add to it, augment it, but don’t change it. So, that’s why the only thing that you’re going to see from Auto ID in the configuration is one checkbox, which says ‘Enable Auto ID.’ And that’s where the magic happens.”
About the author
Sameh Fahmy, MS, is an award-winning freelance medical and technology journalist based in Athens, Georgia, USA.