An Automatic Method for Locating Positions and their Colors Important for Classifying Genders in Retinal Fundus Images by Deep Learning Models.

Journal: Annual International Conference of the IEEE Engineering in Medicine and Biology Society
PMID:

Abstract

This paper proposes an automatic method for identifying the positions and color features in retinal fundus images that are important for gender classification by deep learning models. The proposed method consists of MALCC (Model Analysis by Local Color Characteristics) and a U-test. Each YCbCr image in the dataset is partitioned into blocks, which are masked at random, and the deep learning model predicts the gender probability for each masked image. Multiple regression analysis of the predicted probabilities against the masked blocks identifies the significant blocks and colors, which are visualized in a Total Colormap. The Cr and Cb values of these blocks are then analyzed with a U-test to check whether significant differences exist between the classes. The experiments identify important blocks (positions) and their colors in the images. Our code (RGB version) is available at https://github.com/tsutsu-22/MALCC.
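The core MALCC idea described above, regressing the model's predicted probabilities on which blocks were masked, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the grid size, masking rate, trial count, and the stand-in `fake_model` are all assumptions chosen to make the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 8, 8                # hypothetical 8x8 grid of image blocks
n_blocks = H * W
n_trials = 500             # number of randomly masked images

def fake_model(mask):
    """Stand-in for the trained gender classifier: its output probability
    depends on a few 'important' blocks (indices 10 and 37 here)."""
    return 0.5 + 0.3 * mask[10] - 0.2 * mask[37] + rng.normal(0, 0.01)

# Design matrix X: each row is a 0/1 indicator of which blocks were masked
# in one trial (each block masked with probability 0.3 here).
X = (rng.random((n_trials, n_blocks)) < 0.3).astype(float)
y = np.array([fake_model(row) for row in X])

# Multiple regression of the probabilities on the mask indicators
# (ordinary least squares with an intercept column); blocks whose masking
# shifts the prediction get large-magnitude coefficients.
A = np.hstack([X, np.ones((n_trials, 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# With this setup, the two planted blocks should rank highest.
important = sorted(np.argsort(-np.abs(coef[:n_blocks]))[:2])
print(important)
```

In the actual method, a separate regression per color channel (Y, Cb, Cr) would let the significant coefficients be mapped back to both a position and a color, which is what the Total Colormap visualizes.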

Authors

  • Shota Tsutsui
  • Ichiro Maruko
    Department of Ophthalmology, Tokyo Women's Medical University, Shinjuku, Tokyo, Japan.
  • Moeko Kawai
  • Yoichi Kato
  • Jun Ohya