AIMC Topic: Sign Language

Showing 31 to 40 of 51 articles

Spatial Attention-Based 3D Graph Convolutional Neural Network for Sign Language Recognition.

Sensors (Basel, Switzerland)
Sign language is the main channel for hearing-impaired people to communicate with others. It is a visual language that conveys highly structured manual and non-manual components, so mastering it takes considerable effort for hearing pe...

Hypertuned Deep Convolutional Neural Network for Sign Language Recognition.

Computational Intelligence and Neuroscience
Sign language plays a pivotal role in the lives of people with speech and hearing disabilities, who can convey messages using hand gesture movements. American Sign Language (ASL) recognition is challenging due to the increasing intra-cl...

Real-Time Hand Gesture Recognition Using Fine-Tuned Convolutional Neural Network.

Sensors (Basel, Switzerland)
Hand gesture recognition is one of the most effective modes of interaction between humans and computers due to being highly flexible and user-friendly. A real-time hand gesture recognition system should aim to develop a user-independent interface wit...

Bangla Sign Language (BdSL) Alphabets and Numerals Classification Using a Deep Learning Model.

Sensors (Basel, Switzerland)
A real-time Bangla Sign Language interpreter could bring more than 200,000 hearing- and speech-impaired people in Bangladesh into the mainstream workforce. Bangla Sign Language (BdSL) recognition and detection is a challenging topic in computer vision and ...

AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove.

Nature Communications
Sign language recognition, especially sentence-level recognition, is of great significance for lowering the communication barrier between hearing/speech-impaired people and non-signers. The general glove solutions, which are employed to detect motions...

Artificial Intelligence Technologies for Sign Language.

Sensors (Basel, Switzerland)
AI technologies can play an important role in breaking down the communication barriers between deaf or hearing-impaired people and other communities, contributing significantly to their social inclusion. Recent advances in both sensing technologies and A...

Deep Learning Techniques for Spanish Sign Language Interpretation.

Computational Intelligence and Neuroscience
Around 5% of the world's population suffers from hearing impairment. A major barrier for this group is communicating with others, which can lead to social exclusion and frustration. To overcome this issue, this paper presents a system to interpret ...

Wearable Sensor-Based Sign Language Recognition: A Comprehensive Review.

IEEE Reviews in Biomedical Engineering
Sign language is used as a primary form of communication by many people who are Deaf, deafened, hard of hearing, and non-verbal. Communication barriers exist for members of these populations during daily interactions with those who are unable to unde...

Sensor Fusion of Motion-Based Sign Language Interpretation with Deep Learning.

Sensors (Basel, Switzerland)
Sign language allows hearing-impaired people to interact with others. Nonetheless, knowledge of sign language is uncommon in society, which leads to a communication barrier with the hearing-impaired community. Many studies of sign lang...

Recognition of Non-Manual Content in Continuous Japanese Sign Language.

Sensors (Basel, Switzerland)
The quality of recognition systems for continuous utterances in signed languages has advanced considerably in recent years. However, research efforts often do not address specific linguistic features of signed languages, such as non-manual exp...