Title: Fully automated classification of mammograms using deep residual neural networks
Citation: Proceedings of the IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017), 2017, pp. 310-314
Series/Report no.: IEEE International Symposium on Biomedical Imaging
Conference Name: IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017), 18-21 Apr 2017, Melbourne, Australia
Authors: Neeraj Dhungel, Gustavo Carneiro, Andrew P. Bradley
Abstract: In this paper, we propose a multi-view deep residual neural network (mResNet) for the fully automated classification of mammograms as either malignant or normal/benign. Specifically, our mResNet approach consists of an ensemble of deep residual networks (ResNets) that together take six input images: the unregistered craniocaudal (CC) and mediolateral oblique (MLO) mammogram views, as well as automatically produced binary segmentation maps of the masses and micro-calcifications in each view. We then form the mResNet by concatenating the outputs of each ResNet at the second-to-last layer, followed by a final, fully connected layer. The resulting mResNet is trained in an end-to-end fashion to produce a case-based mammogram classifier that has the potential to be used in breast screening programs. We empirically show, on the publicly available INbreast dataset, that the proposed mResNet classifies mammograms as malignant or normal/benign with an AUC of 0.8.
Keywords: Mammogram; classification; multi-view; residual neural network
Appears in Collections: Computer Science publications
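The abstract describes a late-fusion design: each of the six input images (two mammogram views plus four segmentation maps) passes through its own ResNet backbone, the penultimate-layer features are concatenated, and one final fully connected layer produces the case-level prediction. The toy NumPy sketch below illustrates only that fusion step; all dimensions, weights, and function names are illustrative assumptions, and the linear "backbone" is a stand-in for the deep residual networks used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

FEAT_DIM = 32   # assumed penultimate-layer feature width (illustrative)
IMG = (8, 8)    # toy image size, far smaller than real mammograms

def view_features(image: np.ndarray, weights: np.ndarray) -> np.ndarray:
    # Toy proxy for one per-view ResNet backbone: flatten the image,
    # apply a linear projection, and take ReLU features.
    return np.maximum(image.ravel() @ weights, 0.0)

# Six inputs per case: CC and MLO views, plus binary segmentation maps
# of masses and micro-calcifications for each view.
view_names = ["CC", "MLO", "CC_mass", "MLO_mass", "CC_calc", "MLO_calc"]
views = {name: rng.random(IMG) for name in view_names}

# One backbone per input image, as in the ensemble of ResNets.
backbones = {name: rng.standard_normal((IMG[0] * IMG[1], FEAT_DIM))
             for name in view_names}

# Fusion: concatenate each network's second-to-last-layer output...
fused = np.concatenate([view_features(views[n], backbones[n])
                        for n in view_names])

# ...then a final fully connected layer with a sigmoid gives a
# case-level malignancy score in [0, 1].
w_fc = rng.standard_normal(6 * FEAT_DIM)
p_malignant = 1.0 / (1.0 + np.exp(-(fused @ w_fc)))
print(f"P(malignant) = {p_malignant:.3f}")
```

In the actual mResNet, the whole stack (backbones plus the fusion layer) is trained end-to-end on case labels, rather than using fixed random weights as in this sketch.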