project spotlight | 18 September 2023

AI-Enabled Microscopes Demonstrate the Potential for More Timely and Accurate Cancer Detection

A pathologist examines a tissue sample using the Augmented Reality Microscope (ARM). The monitor in the background displays the sample in real time, with a green inference overlay highlighting suspected cancer.

Department of Veterans Affairs pathologists assisted by AI provide more accurate cancer diagnoses; the technology has the potential to scale across the U.S. Department of Defense and into civilian medicine

MOUNTAIN VIEW, Calif. (September 2023)—The U.S. Department of Defense (DoD) has pioneered many medical and care delivery breakthroughs, including blood transfusions, blood banking, adhesives to seal wounds, and blood clotting bandages. Most recently, DoD is taking the lead on another medical frontier: artificial intelligence (AI) tools for early cancer detection. The Defense Innovation Unit's (DIU's) Predictive Health (PH) program aims to prototype and field AI solutions that will help medical experts transform and improve military health care.

Approximately $1.7 billion of the DoD's annual budget is spent on cancer care (paper, CPI inflation), and that figure continues to grow. Currently, the final determination of whether a suspected lesion is cancer is made by a pathologist. The diagnostic workflow, which involves microscopic review of the biopsy sample by a pathologist, is both time-consuming and open to error. Compounding the problem, the number of healthcare specialists qualified to perform this evaluation is declining, a trend occurring throughout the U.S. healthcare system. Fortunately, technology is available that can help healthcare professionals maintain standards of care (citation).

Since launching in 2020, the PH program has generated four new machine learning (ML) algorithms capable of identifying cancer within digitized pathology slides. Initial findings suggest that AI-enabled technologies like the augmented reality microscope (ARM) have the potential to increase both the accuracy and the efficiency of cancer detection. Pathologists assisted by AI and ML have been found to have higher accuracy, sensitivity, and speed of diagnosis than unassisted pathologists when identifying micrometastases on digitized slides of lymph node biopsies (citation). Along with breast cancer, new AI and ML models have also demonstrated highly predictive capability in detecting prostate cancer within whole slide images (WSI). According to the Centers for Disease Control and Prevention, breast cancer is the second most common cancer among women, while prostate cancer is the second most common cancer among men.

Snapshot of the ARM monitor display. The left side of the display shows the inference highlighting suspected cancer on a tissue sample; the right side shows a heat map of the same tissue sample for additional cueing.

"Doctors have used microscopes for over 100 years. Along with anesthetics and antibiotics, microscopes are one of the three great innovations that transformed medicine to a modern, safe, scientific profession," said Dr. Niels Olson, chief medical officer at DIU. "Full digitization of microscopy has lagged due to the enormous amount of data. Instead, the ARM is the self-driving car of microscopes: it doesn't need to push all the data to the cloud. It can make inferences in real time, in the doctor's lab, even offline, taking the technology to the next level."

The project is a collaboration between DIU, the Department of Veterans Affairs (VA), and VA Ventures, an innovation team at the VA Puget Sound Health Care System, along with partners Jenoptik, Google Public Sector, and DoD's Chief Digital and Artificial Intelligence Office. The AI and ML tools have been incorporated into the pathologist workflow at ten Military Treatment Facilities (MTFs) and VA hospitals in the U.S. and abroad for an IRB-approved study to test the technology and obtain feedback from pathologists. The microscope overlays inferences from the algorithms onto the microscopic field of view and a nearby monitor in real time, providing the pathologist with decision support.
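The real-time overlay workflow described above can be sketched in a few lines. This is an illustrative sketch only: the names `camera.capture`, `model.predict`, and `display.show` are hypothetical stand-ins, since the ARM's actual interfaces are not public.

```python
import numpy as np

def overlay_inference(frame: np.ndarray, heatmap: np.ndarray,
                      threshold: float = 0.5) -> np.ndarray:
    """Blend a green highlight over regions the model flags as suspicious.

    frame:   H x W x 3 uint8 camera image of the microscope field of view.
    heatmap: H x W float array of per-pixel cancer probabilities.
    """
    out = frame.copy()
    mask = heatmap >= threshold  # pixels the model flags as suspected cancer
    green = np.array([0, 255, 0])
    out[mask] = (0.5 * out[mask] + 0.5 * green).astype(np.uint8)
    return out

def run_loop(camera, model, display):
    """One conceptual pass of the real-time decision-support loop:
    no cloud round-trip; inference happens on the device, even offline."""
    frame = camera.capture()        # grab the current field of view
    heatmap = model.predict(frame)  # on-device inference
    display.show(overlay_inference(frame, heatmap))
```

The key design point the sketch captures is that each frame is processed locally and the overlay is redrawn as the pathologist moves the slide, so the tool augments the familiar microscope workflow rather than replacing it.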

“Google Public Sector is proud to help DIU leverage AI to advance early cancer detection,” said Aashima Gupta, global director of Healthcare Strategy and Solutions, Google Cloud. “Our collaboration with DIU gives pathologists an AI assistant that helps them deliver more accurate and timely cancer diagnosis, transforming the healthcare experience for the military community and beyond.”

Thirteen ARMs are helping the team further validate the effectiveness of the AI models by exposing them to a variety of users and case mixes in a second IRB-approved study. In addition, feedback from pathologists with different backgrounds is yielding tremendously valuable data and helping refine the models so the ARM can mature into a full enterprise capability.

The Augmented Reality Microscope (ARM), when paired with the models, uses a high-resolution camera and an augmented reality display to cast an inference outline around regions of the specimen where it detects abnormalities.

“Cutting-edge technology, through collaboration between public-private partners, brings significant positive disruption at scale, ensuring reliable, time-efficient, cost-effective diagnostic care for our active service members, veterans and public at large,” said Dr. Nadeem Zafar, VA Puget Sound Health Care System Director for Pathology and Laboratory Medicine Service.

In high-stakes, life-or-death fields like defense, transportation, and healthcare, assurance of system performance is critical. The DoD team recently published an independent assessment of the first deep learning model deployed on the microscope in the Journal of Pathology Informatics. This model identifies metastatic breast cancer in lymph nodes. The findings provide a degree of assurance for the deep learning model, and the report represents a rigorous evaluation effort for AI in healthcare.

As with any new technology, data privacy is paramount. The system currently relies on internal compute capacity in order to work offline. The AI and ML models were trained on de-identified data, and the system doesn't record or store any personally identifiable information.

Based on analysis of data from the microscope and the existing literature on AI-based cancer detection in whole slide images, the top three takeaways are:

  • Accuracy and Performance: Algorithm-assisted pathologists have been found to have higher accuracy, sensitivity, and speed than unassisted pathologists in identifying micrometastases on digitized slides of lymph node biopsies. The model demonstrated highly accurate classification of cancer within whole slide images, achieving a slide-level AUC of 99%.

  • Image Review/Comparison: Pathologists (subjectively) considered evaluation of images for micrometastases to be significantly easier with algorithm assistance than without.

  • Build New Capabilities into Existing Workflows: Building ML models into microscopes is key to delivering these new capabilities to pathologists where they are. Most pathology practices continue to use glass slides rather than fully digitized workflows, and many pathologists prefer using microscopes. Even during the pandemic, when necessity pushed other clinical modalities to rapidly adopt remote capabilities, most pathologists did not turn to remote sign-out, which depends on whole-slide-imaging digital pathology; they preferred to continue their hands-on workflow centered on the microscope.
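A slide-level AUC, as cited in the first takeaway, can be illustrated with a small sketch. The aggregation rule here (max-pooling per-tile probabilities into one slide score) is an assumption for illustration, not the study's published method; the AUC calculation itself uses the standard rank-sum (Mann-Whitney U) identity.

```python
import numpy as np

def slide_score(tile_probs) -> float:
    """Aggregate per-tile cancer probabilities into one slide-level score.

    Max-pooling is a common (assumed) choice: a slide is suspicious
    if any single tile on it is suspicious."""
    return float(np.max(tile_probs))

def auc(scores, labels) -> float:
    """Area under the ROC curve via the rank-sum identity:
    the fraction of (positive, negative) slide pairs where the
    positive slide scores higher, counting ties as half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() \
        + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return float(wins / (len(pos) * len(neg)))
```

An AUC of 99% means that in roughly 99 of 100 such pairs, a randomly chosen cancer-positive slide receives a higher score than a randomly chosen negative one.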

While the initial results of this project are promising, critical work remains to accelerate adoption of this technology. The DIU team is working with Jenoptik to offer ARMs for purchase through the General Services Administration for government procurement as a research instrument by the summer of 2023. This will enable the wider adoption necessary to sustain further testing and evaluation.