PROJECT TITLE:
Deeply-Supervised Networks With Threshold Loss for Cancer Detection in Automated Breast Ultrasound
Automated breast ultrasound (ABUS) is a novel and promising screening modality for breast cancer. Unlike conventional 2D B-mode ultrasound, in which image acquisition depends on the operator, ABUS acquires volumetric images automatically, without a human operator. However, reviewing ABUS volumes is extremely time-consuming, and lesions may be missed through oversight. To speed up the review process while achieving high detection sensitivity and few false positives (FPs), this study develops a novel 3D convolutional network. We present a densely deep supervision mechanism that uses multi-layer features efficiently to significantly increase detection sensitivity, and we propose a threshold loss that provides voxel-level discrimination between cancer and non-cancer, yielding high sensitivity with low FPs. The network's effectiveness was verified on data from 219 patients (614 ABUS volumes containing 745 cancer regions) and 144 healthy women, totaling over 900 volumes. Extensive experiments show that our approach achieves a sensitivity of 95% with 0.84 FPs per volume. The proposed network thus offers an effective breast cancer detection scheme for ABUS that maintains high sensitivity with low false positives. The source code is available at https://github.com/nawang0226/abus code.
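The threshold loss is only described at a high level above; its exact formulation is given in the paper. As a minimal sketch of one plausible hinge-style variant (the function name, threshold `t`, and `margin` parameter are all illustrative assumptions, not the authors' definition), the idea of penalizing voxels whose predicted cancer probability falls on the wrong side of a decision threshold could look like this:

```python
import numpy as np

def threshold_loss(probs, labels, t=0.5, margin=0.1):
    """Hypothetical voxel-wise threshold loss (illustrative only).

    probs  : predicted cancer probabilities per voxel, in [0, 1]
    labels : ground-truth voxel labels, 1 = cancer, 0 = non-cancer
    t      : assumed decision threshold separating cancer / non-cancer
    margin : assumed safety margin around the threshold
    """
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=float)
    # Cancer voxels are penalized when predicted below t + margin
    # (pushes sensitivity up); non-cancer voxels are penalized when
    # predicted above t - margin (pushes false positives down).
    pos = labels * np.maximum(0.0, (t + margin) - probs)
    neg = (1.0 - labels) * np.maximum(0.0, probs - (t - margin))
    return float((pos + neg).mean())

# Confident predictions on the correct side of the threshold incur no loss:
print(threshold_loss([0.9, 0.2], [1, 0]))  # → 0.0
# Predictions on the wrong side are penalized in proportion to the violation:
print(threshold_loss([0.3, 0.7], [1, 0]))  # → 0.3
```

Compared with plain cross-entropy, a loss of this shape concentrates the gradient on voxels near or beyond the operating threshold, which is one way to trade sensitivity against false positives at the voxel level.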