Cx22: A new publicly available dataset for deep learning-based segmentation of cervical cytology images.

Journal: Computers in biology and medicine
Published Date:

Abstract

The segmentation of cervical cytology images plays an important role in the automatic analysis of cervical cytology screening. Although deep learning-based segmentation methods are well developed in other image segmentation areas, their application to cervical cytology images is still at an early stage. The most important reason for this slow progress is the lack of publicly available, high-quality datasets: research on deep learning-based segmentation methods is hampered by existing datasets that are either artificial or plagued by false-negative objects. In this paper, we develop a new dataset of cervical cytology images, named Cx22, which consists of completely annotated labels of cellular instances based on the open-source images previously released by our institute. First, we meticulously delineate the contours of 14,946 cellular instances in 1320 images generated by our proposed ROI-based label cropping algorithm. Then, we propose baseline methods for deep learning-based semantic and instance segmentation tasks on Cx22. Finally, through experiments, we validate the task suitability of Cx22, and the results reveal the impact of false-negative objects on the performance of the baseline methods. Cx22 can thus provide a foundation for fellow researchers developing high-performance deep learning-based methods for the segmentation of cervical cytology images. Detailed information and step-by-step guidance on accessing the dataset are available at https://github.com/LGQ330/Cx22.
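Semantic segmentation baselines of the kind described above are commonly evaluated with overlap metrics such as the Dice coefficient and IoU. The following is a minimal sketch of those metrics for binary masks; the function name and toy masks are illustrative assumptions, not taken from the paper or its released code.

```python
import numpy as np

def dice_and_iou(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7):
    """Dice coefficient and IoU between two binary segmentation masks.

    `eps` guards against division by zero when both masks are empty.
    """
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = (2.0 * inter + eps) / (pred.sum() + gt.sum() + eps)
    iou = (inter + eps) / (union + eps)
    return float(dice), float(iou)

# Toy example: two 4x4 squares overlapping in a 3x3 region (9 of 16 px each).
gt = np.zeros((8, 8), dtype=np.uint8)
gt[2:6, 2:6] = 1
pred = np.zeros((8, 8), dtype=np.uint8)
pred[3:7, 3:7] = 1
dice, iou = dice_and_iou(pred, gt)  # dice = 18/32, iou = 9/23
```

A false-negative object in the ground truth (an unannotated cell) inflates the apparent false-positive count of a correct prediction, which is one way such label errors depress these scores.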

Authors

  • Guangqi Liu
    Key Laboratory of Opto-Electronic Information Processing, Chinese Academy of Sciences, Shenyang, 110016, China; Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, 110016, China; Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, 110169, China; University of Chinese Academy of Sciences, Beijing, 100049, China. Electronic address: liuguangqi@sia.cn.
  • Qinghai Ding
    Space Star Technology Co, Ltd., Beijing, 100086, China. Electronic address: dingqinghai@sia.cn.
  • Haibo Luo
    Key Laboratory of Opto-Electronic Information Processing, Chinese Academy of Sciences, Shenyang, 110016, China; Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, 110016, China; Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, 110169, China. Electronic address: luohb@sia.cn.
  • Min Sha
    Archives of NEU, Northeastern University, Shenyang, 110819, China. Electronic address: sham@mail.neu.edu.cn.
  • Xiang Li
    Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA, United States.
  • Moran Ju
    College of Information Science and Technology, Dalian Maritime University, Dalian, 116026, China. Electronic address: jumoran@dlmu.edu.cn.