Zongwei Zhou, a member of the Malone Center and an associate research scientist in the Whiting School of Engineering’s Department of Computer Science, has received a four-year, $2.8 million R01 grant from the National Institutes of Health to develop an AI system to enhance the detection and monitoring of metastasis in colorectal cancer using patients’ CT scans.
Colorectal cancer (CRC) is the third most common cancer and the second leading cause of cancer-related deaths in the U.S. Approximately 20% of CRC patients already show signs of metastasis—when cancer spreads to other parts of the body—at diagnosis, making the early detection of this disease crucial. Currently, computed tomography (CT) scans serve as the primary imaging tool for determining how far the cancer has spread, monitoring its progression, and evaluating treatment efficacy.
However, accurately interpreting CT images for early signs of metastasis can be challenging: radiologists must meticulously review images across multiple time points, perform measurements, and provide detailed reports, repetitive tasks that are also cognitively demanding and time-consuming. Increasing workloads and time pressure lead to high rates of burnout, more diagnostic errors, and compromised quality of patient care, underscoring the pressing need for innovation in this area.
Joined by Yang Yang and Kang Wang, both faculty in the Department of Radiology and Biomedical Imaging at the University of California San Francisco School of Medicine, Zhou proposes leveraging AI to develop a comprehensive system that is capable of meticulously analyzing multiple abdominal CT scans across time. The team’s system could enable the earlier detection of subtle changes that may elude human experts while also reducing radiologists’ cognitive load and allowing them to focus on more complex cases and comprehensive patient assessments.
“This project could transform how we detect metastatic colorectal cancer by using AI to spot the subtlest signs on routine CT scans, long before symptoms appear, and ultimately giving patients a better shot at early, lifesaving care,” Zhou says.
The researchers plan to build a large-scale, standardized imaging and report database, employing large language models to automate data extraction from radiology reports, and to develop an anatomy-aware vision-language AI model that detects metastatic lesions, tracks them over time, and generates structured reports, all without extensive manual data curation or annotation. They also plan to conduct prospective studies evaluating the AI system's performance in real-world clinical settings.
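As a rough illustration of what LLM-driven report mining can look like (not the team's actual pipeline), the sketch below prompts a language model to pull structured lesion fields out of a free-text CT report. The field schema, prompt wording, and the `call_llm` stub are hypothetical placeholders standing in for whatever extraction system the project ultimately builds.

```python
# Hypothetical sketch: extracting structured lesion findings from a free-text
# radiology report with a large language model. The schema, prompt, and
# call_llm stub are illustrative assumptions, not the project's actual pipeline.
import json
from dataclasses import dataclass


@dataclass
class LesionFinding:
    organ: str        # e.g., "liver"
    size_mm: float    # longest reported diameter in millimeters
    suspicious: bool  # whether the report flags it as possibly metastatic


EXTRACTION_PROMPT = """\
Extract every lesion mentioned in the CT report below.
Return a JSON list of objects with keys: organ, size_mm, suspicious.

Report:
{report}
"""


def call_llm(prompt: str) -> str:
    # Placeholder for a real LLM API call; returns a canned response so the
    # sketch runs end to end without any external service.
    return '[{"organ": "liver", "size_mm": 14.0, "suspicious": true}]'


def extract_findings(report: str) -> list[LesionFinding]:
    # Send the report to the model and parse its JSON answer into dataclasses.
    raw = call_llm(EXTRACTION_PROMPT.format(report=report))
    return [LesionFinding(**item) for item in json.loads(raw)]


if __name__ == "__main__":
    sample = "Segment VI hepatic lesion measuring 14 mm, suspicious for metastasis."
    for finding in extract_findings(sample):
        print(finding)
```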
“We additionally look forward to collaborating with researchers from other institutions to support the rigorous external validation of our system,” Zhou adds.
Other co-investigators on this project include Bloomberg Distinguished Professor of Computational Cognitive Science Alan Yuille and UCSF Department of Radiology and Biomedical Imaging faculty Michael Ohlinger and Hong Chen.