Deep Learning Techniques for Spanish Sign Language Interpretation

Journal: Computational Intelligence and Neuroscience
Published Date:

Abstract

Around 5% of the world's population suffers from hearing impairment. One of the main barriers these people face is communicating with others, which can lead to social exclusion and frustration. To address this issue, this paper presents a system that interprets the Spanish sign language alphabet, making communication possible in cases where proper nouns such as names, streets, or trademarks must be signed. To this end, we first generated an image dataset of the 30 signed letters that compose the Spanish alphabet. Then, since some letters are static and others involve motion, two kinds of neural networks were tested and compared: convolutional neural networks (CNNs) and recurrent neural networks (RNNs). A comparative analysis of the experimental results highlights the importance of the spatial dimension over the temporal dimension in sign interpretation: CNNs achieve much higher accuracy, with a maximum value of 96.42%.
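To make the CNN side of the comparison concrete, the sketch below shows a minimal 30-class image classifier in Keras. The framework choice, the 64x64 input resolution, and the layer sizes are illustrative assumptions only; they are not the architecture or hyperparameters reported in the paper.

```python
# Minimal sketch (assumed setup, not the authors' exact model):
# a small CNN that classifies hand images into the 30 letters
# of the Spanish sign language alphabet.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 30          # letters of the Spanish alphabet
IMG_SHAPE = (64, 64, 3)   # assumed input size for illustration

model = models.Sequential([
    layers.Input(shape=IMG_SHAPE),
    layers.Conv2D(32, 3, activation="relu"),   # spatial feature extraction
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # one output per letter
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

An RNN-based variant for the in-motion letters would instead consume a short sequence of frames (or per-frame features) per sample; the paper's comparison indicates that, for this alphabet, the spatial cues captured by the CNN matter more than that temporal modeling.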

Authors

  • Ester Martinez-Martin
    Institute for Computer Research, University of Alicante, P.O. Box 99, 03080 Alicante, Spain.
  • Francisco Morillas-Espejo
    Department of Computer Science and Artificial Intelligence, University of Alicante, E-03690 San Vicente del Raspeig, Alicante, Spain.