Self-Supervised Neuron Morphology Representation with Graph Transformer.
Journal:
IEEE transactions on medical imaging
Published Date:
Jul 18, 2025
Abstract
Effective representation of neuronal morphology is essential for cell typing and understanding brain function. However, the complexity of neuronal morphology arises not only from inter-class structural differences but also from intra-class variations across developmental stages and environmental conditions. Such diversity poses significant challenges for existing methods in balancing robustness and discriminative power when representing neuronal morphology. To address this, we propose SGTMorph, a hybrid Graph Transformer framework that leverages the local topological modeling capabilities of graph neural networks and the global relational reasoning strengths of Transformers to explicitly encode neuronal structural information. SGTMorph incorporates a random walk-based positional encoding scheme to facilitate effective information propagation across neuronal graphs and introduces a spatially invariant encoding mechanism to improve adaptability to diverse morphologies. This integrated approach enables a robust and comprehensive representation of neuronal morphology while preserving biological fidelity. To enable label-free feature learning, we devise a self-supervised training strategy grounded in geometric and topological similarity metrics. Extensive experiments on five datasets demonstrate SGTMorph's superior performance in neuron morphology classification and retrieval tasks. Furthermore, its practical utility in neuroscience research is validated by accurate predictions of two functional properties: the laminar distribution of somas and axonal projection patterns. The code is publicly available at: https://github.com/big-rain/SGTMorph.
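The random walk-based positional encoding mentioned in the abstract is commonly computed as the return probabilities along the diagonal of successive powers of the graph's random-walk transition matrix. The sketch below illustrates that general technique on a toy path graph; it is a minimal illustration, not the authors' SGTMorph implementation, and the function name `rwpe` and the choice of k steps are assumptions.

```python
import numpy as np

def rwpe(adj: np.ndarray, k: int = 4) -> np.ndarray:
    """k-step random-walk positional encoding (illustrative sketch).

    For each node i, returns [RW_ii, (RW^2)_ii, ..., (RW^k)_ii],
    where RW = D^{-1} A is the row-normalized transition matrix.
    """
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    rw = adj / np.maximum(deg, 1.0)[:, None]  # row-stochastic random-walk matrix
    pe = np.empty((n, k))
    power = np.eye(n)
    for step in range(k):
        power = power @ rw          # RW^(step+1)
        pe[:, step] = np.diag(power)  # probability of returning to start
    return pe

# Toy tree-like graph (a 3-node path, reminiscent of a neurite branch): 0-1-2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
pe = rwpe(A, k=2)  # each row: [P(return in 1 step), P(return in 2 steps)]
print(pe)
```

Because such encodings depend only on graph connectivity, they are invariant to the 3D placement of the neuron, which is consistent with the spatial-invariance goal stated in the abstract.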