AI Medical Compendium Topic

Explore the latest research on artificial intelligence and machine learning in medicine.

Sign Language

Showing 31 to 40 of 46 articles

Dynamic Hand Gesture Recognition Based on a Leap Motion Controller and Two-Layer Bidirectional Recurrent Neural Network.

Sensors (Basel, Switzerland)
Dynamic hand gesture recognition is one of the most significant tools for human-computer interaction. In order to improve the accuracy of dynamic hand gesture recognition, in this paper, a two-layer Bidirectional Recurrent Neural Network for the ...
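As a rough illustration of the architecture class this abstract names (not the authors' code), the sketch below stacks two bidirectional recurrent layers over a sequence of Leap Motion frame features; the feature size, hidden width, and class count are assumptions.

# Minimal sketch, assuming per-frame Leap Motion feature vectors; not the paper's model.
import torch
import torch.nn as nn

class BiGRUClassifier(nn.Module):
    def __init__(self, n_features=63, hidden=128, n_classes=10):
        super().__init__()
        # Two stacked bidirectional recurrent layers over the frame sequence.
        self.rnn = nn.GRU(n_features, hidden, num_layers=2,
                          batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):            # x: (batch, frames, n_features)
        out, _ = self.rnn(x)         # (batch, frames, 2 * hidden)
        return self.fc(out[:, -1])   # classify from the final time step

model = BiGRUClassifier()
logits = model(torch.randn(4, 60, 63))   # 4 gestures, 60 frames each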

Sign Language Recognition Using Wearable Electronics: Implementing k-Nearest Neighbors with Dynamic Time Warping and Convolutional Neural Network Algorithms.

Sensors (Basel, Switzerland)
We propose a sign language recognition system based on wearable electronics and two different classification algorithms. The wearable electronics were made of a sensory glove and inertial measurement units to gather fingers, wrist, and arm/forearm mo...
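For readers unfamiliar with the k-nearest-neighbors-with-dynamic-time-warping pairing mentioned in the title, a minimal sketch follows; it is not the article's implementation, and the glove/IMU feature layout is assumed.

# Hedged sketch: 1-NN/k-NN classification with DTW distance over sensor time series.
import numpy as np

def dtw_distance(a, b):
    """DTW cost between two sequences of frame vectors, shapes (len_a, d) and (len_b, d)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def knn_predict(query, templates, labels, k=3):
    """Label a query sequence by majority vote over its k DTW-nearest training templates."""
    dists = [dtw_distance(query, t) for t in templates]
    nearest = np.argsort(dists)[:k]
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)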

British Sign Language Recognition via Late Fusion of Computer Vision and Leap Motion with Transfer Learning to American Sign Language.

Sensors (Basel, Switzerland)
In this work, we show that a late fusion approach to multimodality in sign language recognition improves the overall ability of the model in comparison to the singular approaches of image classification (88.14%) and Leap Motion data classification (7...
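Late fusion here means combining the outputs of independently trained modality-specific classifiers at decision time. A minimal sketch, assuming a simple weighted average of class probabilities (the article's exact fusion scheme is not reproduced here):

# Illustrative late fusion of an image classifier and a Leap Motion classifier.
import numpy as np

def late_fusion(p_image, p_leap, w_image=0.5):
    """p_image, p_leap: (n_classes,) probability vectors for one sample."""
    fused = w_image * p_image + (1.0 - w_image) * p_leap
    return int(np.argmax(fused))

# Example: the two modalities disagree; fusion picks the better-supported class.
print(late_fusion(np.array([0.7, 0.2, 0.1]), np.array([0.2, 0.7, 0.1])))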

Recognition of Non-Manual Content in Continuous Japanese Sign Language.

Sensors (Basel, Switzerland)
The quality of recognition systems for continuous utterances in signed languages has advanced considerably in recent years. However, research efforts often do not address specific linguistic features of signed languages, such as non-manual exp...

Sensor Fusion of Motion-Based Sign Language Interpretation with Deep Learning.

Sensors (Basel, Switzerland)
Sign language was designed to allow hearing-impaired people to interact with others. Nonetheless, knowledge of sign language is uncommon in society, which leads to a communication barrier with the hearing-impaired community. Many studies of sign lang...

Deep Learning Techniques for Spanish Sign Language Interpretation.

Computational intelligence and neuroscience
Around 5% of the world population suffers from hearing impairment. One of the main barriers these people face is communicating with others, which can lead to social exclusion and frustration. To overcome this issue, this paper presents a system to interpret ...

Wearable Sensor-Based Sign Language Recognition: A Comprehensive Review.

IEEE reviews in biomedical engineering
Sign language is used as a primary form of communication by many people who are Deaf, deafened, hard of hearing, and non-verbal. Communication barriers exist for members of these populations during daily interactions with those who are unable to unde...

Artificial Intelligence Technologies for Sign Language.

Sensors (Basel, Switzerland)
AI technologies can play an important role in breaking down the communication barriers of deaf or hearing-impaired people with other communities, contributing significantly to their social inclusion. Recent advances in both sensing technologies and A...

AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove.

Nature communications
Sign language recognition, especially sentence recognition, is of great significance for lowering the communication barrier between the hearing/speech impaired and non-signers. The general glove solutions, which are employed to detect motions...

Bangla Sign Language (BdSL) Alphabets and Numerals Classification Using a Deep Learning Model.

Sensors (Basel, Switzerland)
A real-time Bangla Sign Language interpreter could enable more than 200,000 hearing- and speech-impaired people in Bangladesh to join the mainstream workforce. Bangla Sign Language (BdSL) recognition and detection is a challenging topic in computer vision and ...