AIMC Topic: Gestures

Showing 11 to 20 of 246 articles

Efficient control of spider-like medical robots with capsule neural networks and modified spring search algorithm.

Scientific Reports
This study introduces an innovative method for gesture recognition in medical robotics, utilizing Capsule Neural Networks (CNNs) in conjunction with the Modified Spring Search Algorithm (MSSA). This approach achieves remarkable efficiency in gesture ...
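
The snippet names the building blocks but not their internals. As an editorial sketch only (not the paper's implementation; the tensor shapes and layer sizes are assumptions), the core of a capsule layer is the "squash" non-linearity, which preserves a capsule vector's direction while bounding its length:

```python
import torch

def squash(s: torch.Tensor, dim: int = -1, eps: float = 1e-8) -> torch.Tensor:
    """Capsule 'squash' non-linearity: shrinks short vectors toward zero
    and pushes long vectors toward unit length, keeping their direction."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / torch.sqrt(sq_norm + eps)

# Example: a batch of 32 samples, each with 10 capsules of dimension 16 (made-up sizes)
capsules = torch.randn(32, 10, 16)
out = squash(capsules)
print(out.shape)               # torch.Size([32, 10, 16])
print(out.norm(dim=-1).max())  # all capsule lengths stay below 1
```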

Understanding Robot Gesture Perception in Children with Autism Spectrum Disorder during Human-Robot Interaction.

International Journal of Neural Systems
Social robots are increasingly used in therapeutic contexts, especially as a complement to therapy for children with Autism Spectrum Disorder (ASD). This study therefore aims to understand how children with ASD perceive and i...

YOLOv8-G2F: A portable gesture recognition optimization algorithm.

Neural Networks: The Official Journal of the International Neural Network Society
Hand gesture recognition (HGR) is a significant research area with applications in human-computer interaction, artificial intelligence, and more. Early HGR systems faced high hardware costs and demanding usage requirements. To...
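
The G2F modifications are not described in this excerpt. For orientation only, a minimal sketch of running a stock YOLOv8 detector with the ultralytics package follows; the weights file and image name are placeholders, not the paper's gesture model:

```python
from ultralytics import YOLO

# Load a stock, pretrained YOLOv8 nano model (not the paper's G2F variant).
model = YOLO("yolov8n.pt")

# Run detection on a single image; each result carries boxes, classes, confidences.
results = model("hand_gesture.jpg")  # placeholder image path

for r in results:
    for box in r.boxes:
        cls_id = int(box.cls[0])
        conf = float(box.conf[0])
        print(f"{model.names[cls_id]}: {conf:.2f}, xyxy={box.xyxy[0].tolist()}")
```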

sEMG-Based Gesture Recognition via Multi-Feature Fusion Network.

IEEE Journal of Biomedical and Health Informatics
Sparse surface electromyography (sEMG)-based gesture recognition suffers from insufficient feature richness and poor generalization to small-sample data. Therefore, a multi-feature fusion network (MFF-Net) model is proposed in this pa...
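
As a rough illustration of the fusion idea only (the branch inputs, layer sizes, and class count are assumptions, not MFF-Net's actual design), a two-branch network that fuses time-domain and frequency-domain sEMG features could look like:

```python
import torch
import torch.nn as nn

class TwoBranchFusion(nn.Module):
    """Toy two-branch fusion: separate encoders for time-domain and
    frequency-domain sEMG features, concatenated before classification."""
    def __init__(self, time_dim=64, freq_dim=64, hidden=128, n_classes=8):
        super().__init__()
        self.time_enc = nn.Sequential(nn.Linear(time_dim, hidden), nn.ReLU())
        self.freq_enc = nn.Sequential(nn.Linear(freq_dim, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x_time, x_freq):
        fused = torch.cat([self.time_enc(x_time), self.freq_enc(x_freq)], dim=-1)
        return self.head(fused)

model = TwoBranchFusion()
logits = model(torch.randn(4, 64), torch.randn(4, 64))
print(logits.shape)  # torch.Size([4, 8])
```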

Real-Time American Sign Language Interpretation Using Deep Learning and Keypoint Tracking.

Sensors (Basel, Switzerland)
Communication barriers pose significant challenges for the Deaf and Hard-of-Hearing (DHH) community, limiting their access to essential services, social interactions, and professional opportunities. To bridge this gap, assistive technologies leveragi...
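
The excerpt does not state which keypoint tracker the authors used. Assuming a MediaPipe-style pipeline purely for illustration, the 21 hand landmarks that such ASL classifiers typically consume can be extracted like this (image path is a placeholder):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Static-image mode for a single frame; real-time use would set static_image_mode=False.
with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    image = cv2.imread("sign.jpg")  # BGR frame from disk or a webcam
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        landmarks = results.multi_hand_landmarks[0].landmark
        # 21 (x, y, z) keypoints, normalized to the image size
        feature_vector = [(lm.x, lm.y, lm.z) for lm in landmarks]
        print(len(feature_vector))  # 21
```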

Innovative hand pose based sign language recognition using hybrid metaheuristic optimization algorithms with deep learning model for hearing impaired persons.

Scientific Reports
Sign language (SL) is an effective mode of communication that uses visual-physical means such as hand signals, facial expressions, and body actions to let the deaf and hard-of-hearing community communicate, express opinions, and carry signifi...

Gesture Recognition Achieved by Utilizing LoRa Signals and Deep Learning.

Sensors (Basel, Switzerland)
This study proposes a novel gesture recognition system based on LoRa technology, integrating advanced signal preprocessing, adaptive segmentation algorithms, and an improved SS-ResNet50 deep learning model. Through the combination of residual learnin...
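
The SS-ResNet50 modifications are not given in this excerpt. As a baseline sketch under stated assumptions (a plain torchvision ResNet50, spectrogram-style 3-channel inputs, and a 6-gesture vocabulary), re-heading the backbone for LoRa-signal gesture classes would look like:

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

# Standard ResNet50 backbone, output layer replaced for 6 gesture classes (assumed count).
model = resnet50(weights=None)
model.fc = nn.Linear(model.fc.in_features, 6)

# Preprocessed LoRa signal segments, reshaped to 3x224x224 spectrogram-style "images".
batch = torch.randn(2, 3, 224, 224)
print(model(batch).shape)  # torch.Size([2, 6])
```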

Eye-gesture control of computer systems via artificial intelligence.

F1000Research
BACKGROUND: Artificial Intelligence (AI) offers transformative potential for human-computer interaction, particularly through eye-gesture recognition, enabling intuitive control for users and accessibility for individuals with physical impairments.

IoT-driven smart assistive communication system for the hearing impaired with hybrid deep learning models for sign language recognition.

Scientific Reports
Deaf and hard-of-hearing people rely on sign language recognition (SLR) to communicate with one another. Sign language (SL) uses varied hand gestures to convey letters, words, and sentences. It aids ...

Convolutional neural network for gesture recognition human-computer interaction system design.

PLoS ONE
Gesture interaction applications have garnered significant attention from researchers in the field of human-computer interaction due to their inherent convenience and intuitiveness. Addressing the challenge posed by the insufficient feature extractio...
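
The excerpt cuts off before any architectural detail. Purely as a baseline sketch (channel counts, input resolution, and class count are assumptions, not the paper's design), a small CNN gesture classifier for such an HCI system might be:

```python
import torch
import torch.nn as nn

class SmallGestureCNN(nn.Module):
    """Baseline CNN: two conv blocks followed by a linear classifier."""
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):  # x: (batch, 3, 64, 64)
        return self.classifier(self.features(x).flatten(1))

model = SmallGestureCNN()
print(model(torch.randn(4, 3, 64, 64)).shape)  # torch.Size([4, 10])
```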