Dual coding of knowledge in the human brain.

Journal: Trends in Cognitive Sciences
PMID:

Abstract

How does the human brain code knowledge about the world? While disciplines such as artificial intelligence represent world knowledge using human language, neurocognitive models of knowledge have been dominated by sensory embodiment, in which knowledge is derived from sensory/motor experience and supported by high-level sensory/motor and association cortices. The neural correlates of an alternative, disembodied symbolic system have previously been difficult to establish. A recent line of studies probing knowledge of visual properties, such as color, in visually deprived individuals converges to provide positive, compelling evidence for non-sensory, language-derived knowledge representation in the dorsal anterior temporal lobe and the extended language network, in addition to sensory-derived representations, leading to a sketch of a dual-coding neural framework for knowledge.

Authors

  • Yanchao Bi
    State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, 100875, China; Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, 100875, China. Electronic address: ybi@bnu.edu.cn.