Computational reconstruction of mental representations using human behavior.

Journal: Nature Communications
PMID:

Abstract

Revealing how the mind represents information is a longstanding goal of cognitive science. However, there is currently no framework for reconstructing the broad range of mental representations that humans possess. Here, we ask participants to indicate what they perceive in images synthesized from random visual features in a deep neural network. We then infer associations between the semantic features of their responses and the visual features of the images. This allows us to reconstruct the mental representations of multiple visual concepts, both those supplied by participants and other concepts extrapolated from the same semantic space. We validate these reconstructions in separate participants and further generalize our approach to predict behavior for new stimuli and in a new task. Finally, we reconstruct the mental representations of individual observers and of a neural network. This framework enables a large-scale investigation of conceptual representations.
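The abstract does not specify the estimation procedure, but the association step it describes (linking the visual features of random images to the semantic features of participants' responses) can be sketched as a regularized linear mapping. The sketch below is a minimal illustration under assumed details: synthetic data stand in for the deep-network visual features and the response embeddings, ridge regression stands in for the paper's actual inference method, and "reconstruction" is simplified to reading out the visual pattern associated with one semantic dimension.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, d_visual, d_semantic = 500, 64, 16

# Hypothetical data: visual feature vectors of the random images shown,
# and semantic embeddings of participants' free responses to them.
X = rng.standard_normal((n_trials, d_visual))
W_true = rng.standard_normal((d_visual, d_semantic))  # unknown ground truth
S = X @ W_true + 0.1 * rng.standard_normal((n_trials, d_semantic))

# Association step: ridge regression from visual to semantic features.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(d_visual), X.T @ S)

# Simplified "reconstruction": the visual feature pattern most associated
# with a chosen semantic direction (here a one-hot semantic vector; a
# stand-in for the paper's full reconstruction procedure).
concept = np.zeros(d_semantic)
concept[3] = 1.0
reconstruction = W @ concept  # d_visual-dimensional visual pattern
```

Because the learned map lives in a shared semantic space, the same readout could in principle be applied to semantic vectors of concepts participants never reported, which is the extrapolation the abstract mentions.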

Authors

  • Laurent Caplette
    Department of Psychology, Yale University, New Haven, CT, USA. laurent.caplette@yale.edu.
  • Nicholas B Turk-Browne
Department of Psychology, Yale University, New Haven, CT, USA.