Fusing Heterogeneous Features From Stacked Sparse Autoencoder for Histopathological Image Analysis.
Journal:
IEEE Journal of Biomedical and Health Informatics
Published Date:
Jul 29, 2015
Abstract
In the analysis of histopathological images, both holistic (e.g., architecture) features and local appearance features demonstrate excellent performance, although their accuracy may vary dramatically across different inputs. This motivates us to investigate how to fuse the results from these features to enhance accuracy. In particular, we employ content-based image retrieval approaches to discover morphologically relevant images for image-guided diagnosis, using holistic and local features, both of which are generated from the cell detection results of a stacked sparse autoencoder. Because of the dramatically different characteristics and representations of these heterogeneous features (i.e., holistic and local), their retrieval results may not agree with each other, causing difficulties for traditional fusion methods. In this paper, we employ a graph-based query-specific fusion approach in which multiple retrieval results (i.e., rank lists) are integrated and reordered based on a fused graph. The proposed method is capable of adaptively combining the strengths of local and holistic features for different inputs. We evaluate our method on a challenging clinical problem, i.e., histopathological image-guided diagnosis of intraductal breast lesions, and it achieves 91.67% classification accuracy on 120 breast tissue images from 40 patients.
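The query-specific fusion step described in the abstract can be pictured as building a reciprocal-neighbor graph from each feature's rank list, merging the graphs, and reranking by a random walk restarted at the query. The Python sketch below is a minimal illustration of that general recipe, not the paper's exact formulation: the neighborhood size k, the Jaccard edge weights, and the personalized-PageRank-style reranking are assumptions made for illustration.

```python
# Minimal, illustrative sketch of graph-based fusion of rank lists produced by
# heterogeneous features (e.g., holistic and local). Assumptions: each item in
# the database has its own rank list; edge weighting and the random-walk
# reranking are illustrative choices, not the authors' exact method.
import numpy as np

def knn_graph(rank_lists, k=5):
    """Build a weighted graph from per-item rank lists.

    rank_lists: dict mapping item id -> list of item ids ordered by similarity
    (the retrieval result when that item is used as the query).
    An edge is kept only between reciprocal neighbors and weighted by the
    Jaccard overlap of the two items' top-k neighborhoods.
    """
    graph = {}
    for q, ranked in rank_lists.items():
        top_q = set(ranked[:k])
        for n in ranked[:k]:
            if q in set(rank_lists[n][:k]):          # reciprocal neighbors only
                top_n = set(rank_lists[n][:k])
                w = len(top_q & top_n) / len(top_q | top_n)
                graph[(q, n)] = max(graph.get((q, n), 0.0), w)
    return graph

def fuse_graphs(graphs):
    """Fuse graphs built from different features by summing edge weights."""
    fused = {}
    for g in graphs:
        for edge, w in g.items():
            fused[edge] = fused.get(edge, 0.0) + w
    return fused

def rerank(fused, query, items, alpha=0.85, iters=50):
    """Rerank items by a random walk on the fused graph, restarted at the query."""
    idx = {item: i for i, item in enumerate(items)}
    A = np.zeros((len(items), len(items)))
    for (u, v), w in fused.items():
        A[idx[u], idx[v]] = w
    row_sums = A.sum(axis=1, keepdims=True)
    P = np.divide(A, row_sums, out=np.zeros_like(A), where=row_sums > 0)
    r = np.full(len(items), 1.0 / len(items))
    e = np.zeros(len(items))
    e[idx[query]] = 1.0                               # restart at the query node
    for _ in range(iters):
        r = alpha * P.T @ r + (1 - alpha) * e
    return sorted(items, key=lambda it: -r[idx[it]])
```

In this sketch, each feature type (holistic or local) would contribute its own graph via knn_graph, the graphs would be merged with fuse_graphs, and rerank would produce the final retrieval order for a given query; the adaptivity to different inputs comes from the graph structure around the query rather than from fixed fusion weights.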