Augmenting interaction effects in convolutional networks with Taylor polynomial gated units.

Journal: Neural networks : the official journal of the International Neural Network Society
Published Date:

Abstract

Transformer-based vision models are often assumed to have an advantage over traditional convolutional neural networks (CNNs) due to their ability to model long-range dependencies and interactions between inputs. However, the remarkable success of pure convolutional models such as ConvNeXt, which incorporates architectural elements from Vision Transformers (ViTs), challenges the prevailing assumption about the intrinsic superiority of Transformers. In this work, we aim to explore an alternative path to efficiently express interactions between inputs without an attention module by delving into the interaction effects in ConvNeXt. This exploration leads to the proposal of a new activation function, the Taylor Polynomial Gated Unit (TPGU). The TPGU substitutes the cumulative distribution function in the Gaussian Error Linear Unit (GELU) with a learnable Taylor polynomial, so that it can not only flexibly adjust the strength of each order of interactions but also requires no additional normalization or regularization of the input and output. Comprehensive experiments demonstrate that swapping out GELUs for TPGUs notably boosts model performance under identical training settings. Moreover, empirical evidence highlights the particularly favorable impact of the TPGU on pure convolutional networks: it enhances the performance of ConvNeXt-T by 0.7% on ImageNet-1K. Our findings encourage revisiting the potential utility of polynomials within contemporary neural network architectures. The code for our implementation has been made publicly available at https://github.com/LQandlq/tpgu.
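To make the gating idea concrete, the following is a minimal NumPy sketch of the substitution the abstract describes: GELU gates its input with the Gaussian CDF, GELU(x) = x · Φ(x), while a TPGU-style unit gates with a polynomial whose coefficients would be learnable in the actual model. The polynomial order and the initial coefficients below (the low-order Taylor expansion of Φ around 0, i.e. 1/2 + x/√(2π)) are illustrative assumptions, not the paper's exact configuration; see the authors' repository for the real implementation.

```python
import math
import numpy as np

def gelu(x):
    # Exact GELU: x * Phi(x), with Phi the standard Gaussian CDF.
    return x * 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))

def tpgu(x, coeffs):
    # TPGU-style gate (sketch): x * sum_k a_k * x^k.
    # In the actual model the coefficients a_k are learnable parameters;
    # here they are fixed for illustration.
    gate = sum(a * x ** k for k, a in enumerate(coeffs))
    return x * gate

# Hypothetical initialization: first-order Taylor expansion of Phi at 0,
# Phi(x) ~= 0.5 + x / sqrt(2*pi), padded to a degree-3 polynomial.
coeffs = [0.5, 1.0 / math.sqrt(2.0 * math.pi), 0.0, 0.0]

x = np.array([-1.0, -0.1, 0.0, 0.1, 1.0])
y = tpgu(x, coeffs)
```

With this initialization the unit matches GELU closely near zero (where the Taylor expansion is accurate) and diverges for large |x|, which is where the learnable coefficients give the model freedom to reshape each order of interaction.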

Authors

  • Ligeng Zou
    Hunan Provincial Key Laboratory of Intelligent Computing and Language Information Processing, Hunan Normal University, Changsha 410081, China; College of Information Science and Engineering, Hunan Normal University, Changsha 410081, China.
  • Qi Liu
    National Institute of Traditional Chinese Medicine Constitution and Preventive Medicine, Beijing University of Chinese Medicine, Beijing, China.
  • Jianhua Dai