Comput Methods Programs Biomed. 2026 Apr 2;281:109352. doi: 10.1016/j.cmpb.2026.109352. Online ahead of print.
ABSTRACT
BACKGROUND AND OBJECTIVE: Segmentation of brain vascular structures is a current challenge in radiology for the diagnosis of human vascular pathologies. Owing to the complex nature of cerebral vessels, collecting a case set with expert-annotated masks for supervised segmentation methods is laborious and ambiguous. Objective analysis of automatic segmentation results remains an open challenge. To overcome these difficulties, a method for semi-automatically generating synthetic vessels within original computed tomography angiography images using expert masks was developed.
METHODS: Generating synthetic vessels that reflect real vessels enables an objective evaluation of segmentation methods and of the geometric parameters of the vessels derived from their segmentation. In this article, the results of four segmentation methods were examined on generated vessels embedded in original images: UNETR, V-NET, nnUNET, and the classic Frangi method, which remains the baseline reference. In addition, selected geometric parameters of the segmented vessels were analyzed, including centerline distances, length, diameter, curvature, and tortuosity.
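The abstract does not specify the formulas used for the geometric parameters; a common definition of vessel tortuosity is the arc-chord ratio of the centerline. The following minimal Python sketch, assuming an ordered centerline given as a point array (the function name and interface are illustrative, not from the paper), shows how length and tortuosity could be computed:

```python
import numpy as np

def centerline_metrics(points):
    """Basic geometric parameters of a vessel centerline.

    points: (N, 3) array of ordered centerline coordinates (e.g. in mm).
    Returns (arc_length, chord_length, tortuosity), where tortuosity
    is the arc-chord ratio (>= 1; exactly 1 for a straight vessel).
    """
    points = np.asarray(points, dtype=float)
    segs = np.diff(points, axis=0)                    # consecutive segment vectors
    arc_length = np.linalg.norm(segs, axis=1).sum()   # polyline length
    chord_length = np.linalg.norm(points[-1] - points[0])
    tortuosity = arc_length / chord_length
    return arc_length, chord_length, tortuosity

# Example: a half-turn of a gentle helix is longer than its chord,
# so its tortuosity exceeds 1.
t = np.linspace(0.0, np.pi, 200)
helix = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
arc, chord, tort = centerline_metrics(helix)
```

With synthetic vessels, such metrics can be compared directly between the known ground-truth centerline and the one extracted from each segmentation result.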
RESULTS: This study investigated differences between the results of automatic segmentation of the selected arteries and the synthetic reference data. The obtained results indicate a significant correlation between vessel geometric parameters and segmentation quality.
CONCLUSIONS: Even nnUNET, commonly considered the most effective vessel segmentation method, exhibits statistically significant differences in the determined vessel parameters. The objective analysis of segmentation results and their geometric parameters, made possible by the developed vessel generation method, indicates a clear need for further development of vessel segmentation methods.
PMID:41945973 | DOI:10.1016/j.cmpb.2026.109352