Deep-learning-based multi-class segmentation for automated, non-invasive routine assessment of human pluripotent stem cell culture status.

Journal: Computers in biology and medicine
PMID:

Abstract

Human induced pluripotent stem cells (hiPSCs) are capable of differentiating into a variety of human tissue cells and offer new opportunities for personalized medicine and drug screening. These applications require large quantities of high-quality hiPSCs, obtainable only through automated cultivation. One of the major requirements of automated cultivation is regular, non-invasive analysis of the cell condition, e.g. by whole-well microscopy. However, despite the urgency of this requirement, there are currently no automatic, image-processing-based solutions for multi-class routine quantification of this kind. This paper describes a method to fully automate cell state recognition based on phase-contrast microscopy and deep learning. The approach can be used for in-process control during automated hiPSC cultivation. The U-Net-based algorithm segments important parameters of hiPSC colony formation and discriminates between the classes hiPSC colony, single hiPSC, differentiated cell and dead cell. For the classes hiPSC colonies, differentiated cells, single hiPSCs and dead cells, the model achieves more accurate results than visual estimation by a skilled expert. Furthermore, parameters such as roundness, size, center of gravity and inclusions of other cells are derived directly from the classification result for each hiPSC colony. These parameters provide localized information about the cell state and enable well-based treatment of the cell culture in automated processes. Thus, the model can be exploited for routine, non-invasive image analysis during automated hiPSC cultivation, facilitating the generation of high-quality hiPSC-derived products for biomedical purposes.
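The colony-level parameters mentioned in the abstract (size, center of gravity, roundness) can be derived directly from a per-colony segmentation mask. The sketch below is illustrative only, not the authors' implementation: it assumes a NumPy boolean mask for a single colony and uses a crude boundary-pixel count as the perimeter estimate in the standard roundness formula 4*pi*A / P^2.

```python
import numpy as np

def colony_parameters(mask):
    """Derive illustrative colony parameters from a binary mask.

    Returns area (pixel count), centroid ("center of gravity"),
    and a roundness score 4*pi*A / P**2, where P is the number of
    foreground pixels with at least one 4-connected background
    neighbour (a crude perimeter estimate).
    """
    ys, xs = np.nonzero(mask)
    area = len(xs)
    centroid = (float(ys.mean()), float(xs.mean()))
    # Boundary pixels: foreground whose 4-neighbourhood is not all foreground.
    padded = np.pad(np.asarray(mask, dtype=bool), 1)
    core = padded[1:-1, 1:-1]
    all_neighbours = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                      & padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = int(np.count_nonzero(core & ~all_neighbours))
    roundness = 4 * np.pi * area / perimeter ** 2 if perimeter else 0.0
    return {"area": area, "centroid": centroid, "roundness": roundness}

# Demo on a synthetic disc-shaped "colony": should score as nearly round.
yy, xx = np.mgrid[:64, :64]
disc = (yy - 32) ** 2 + (xx - 32) ** 2 <= 20 ** 2
params = colony_parameters(disc)
```

In practice a library such as scikit-image (`measure.label` plus `measure.regionprops`) would provide these per-colony measurements more robustly; the point here is only that all of them follow from the segmentation output without further imaging.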

Authors

  • Tobias Piotrowski
    Fraunhofer Institute for Production Technology IPT, Aachen, Germany. Electronic address: tobias.piotrowski@ipt.fraunhofer.de.
  • Oliver Rippel
    Institute of Imaging and Computer Vision, RWTH Aachen University, Aachen, Germany.
  • Andreas Elanzew
    Life & Brain GmbH, Cellomics Unit, Bonn, Germany; Institute of Reconstructive Neurobiology, University of Bonn Medical Faculty & University Hospital Bonn, Bonn, Germany.
  • Bastian Nießing
    Fraunhofer Institute for Production Technology IPT, Aachen, Germany.
  • Sebastian Stucken
    Fraunhofer Institute for Production Technology IPT, Aachen, Germany.
  • Sven Jung
    Fraunhofer Institute for Production Technology IPT, Aachen, Germany.
  • Niels König
    Fraunhofer Institute for Production Technology IPT, Aachen, Germany.
  • Simone Haupt
    Life & Brain GmbH, Cellomics Unit, Bonn, Germany.
  • Laura Stappert
    Life & Brain GmbH, Cellomics Unit, Bonn, Germany.
  • Oliver Brüstle
    Life & Brain GmbH, Cellomics Unit, Bonn, Germany; Institute of Reconstructive Neurobiology, University of Bonn Medical Faculty & University Hospital Bonn, Bonn, Germany.
  • Robert Schmitt
    WZL Aachen, RWTH Aachen, Aachen, Germany.
  • Stephan Jonas
    Department of Medical Informatics, RWTH Aachen University, Pauwelsstr. 30, 52057 Aachen, Germany.