S. Albarqouni, C. Baur, F. Achilles, V. Belagiannis, S. Demirci, N. Navab
AggNet: Deep Learning from Crowds for Mitosis Detection in Breast Cancer Histology Images. IEEE Transactions on Medical Imaging (TMI), Special Issue on Deep Learning, vol. 35, no. 5, pp. 1313-1321, 2016. The first two authors contributed equally to this paper.
The lack of publicly available ground-truth data has been identified as the major challenge for transferring recent developments in deep learning to the biomedical imaging domain. Although crowdsourcing has enabled the annotation of large-scale databases of real-world images, its application for biomedical purposes requires a deeper understanding, and hence a more precise definition, of the actual annotation task. The fact that expert tasks are being outsourced to non-expert users may lead to noisy annotations that introduce disagreement between users. Although crowd annotations are a valuable resource for learning annotation models, conventional machine-learning methods may have difficulty dealing with noisy annotations during training. In this manuscript, we present a new concept for learning from crowds that handles data aggregation directly as part of the learning process of the convolutional neural network (CNN) via an additional crowdsourcing layer (AggNet). In addition, we present an experimental study on learning from crowds designed to answer the following questions: (i) Can a deep CNN be trained with data collected from crowdsourcing? (ii) How can the CNN be adapted to train on multiple types of annotation data (ground truth and crowd-based)? (iii) How does the choice of annotation and aggregation affect the accuracy? Our experimental setup involved Annot8, a self-implemented web platform based on the CrowdFlower API that realizes image annotation tasks for a publicly available biomedical image database. Our results give valuable insights into the functionality of deep CNN learning from crowd annotations and demonstrate the necessity of integrating data aggregation.
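To make the idea of an additional crowdsourcing/aggregation layer concrete, the following is a minimal sketch, not the paper's exact AggNet formulation: a small CNN whose softmax output is pushed through one learned confusion matrix per annotator, so that noisy crowd labels can supervise training directly while the network maintains a latent estimate of the true label. The network layout, patch size, worker count, and the confusion-matrix parameterization are all illustrative assumptions.

```python
# Illustrative sketch only (PyTorch): a CNN with a per-annotator
# "aggregation" layer modeled as learned confusion matrices. This is a
# generic crowd-layer construction, not the exact AggNet formulation;
# all shapes and hyperparameters below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, NUM_ANNOTATORS = 2, 5   # e.g. mitosis vs. non-mitosis, 5 workers

class CrowdCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, NUM_CLASSES)  # 32x32 input patches
        # One learnable confusion matrix per annotator, initialized near identity.
        self.confusion = nn.Parameter(
            torch.eye(NUM_CLASSES).repeat(NUM_ANNOTATORS, 1, 1))

    def forward(self, x):
        z = self.features(x).flatten(1)
        p_true = F.softmax(self.classifier(z), dim=1)  # latent "true" label belief
        # p_annot[b, a, c]: probability that annotator a labels patch b as class c,
        # obtained by pushing p_true through annotator a's confusion matrix.
        p_annot = torch.einsum('akc,bk->bac',
                               F.softmax(self.confusion, dim=2), p_true)
        return p_true, p_annot

model = CrowdCNN()
patches = torch.randn(4, 3, 32, 32)                     # batch of image patches
crowd_labels = torch.randint(0, NUM_CLASSES, (4, NUM_ANNOTATORS))
p_true, p_annot = model(patches)
# Negative log-likelihood of each annotator's (noisy) label; nll_loss expects
# shape (batch, classes, extra_dims), hence the permute.
loss = F.nll_loss(torch.log(p_annot + 1e-8).permute(0, 2, 1), crowd_labels)
loss.backward()
```

At inference time only p_true would be used; the per-annotator matrices exist solely to absorb label noise during training, which is the role the abstract assigns to the aggregation step.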
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.