AIMC Topic: Gestures

Showing 31 to 40 of 240 articles

Machine Learning-Assisted Gesture Sensor Made with Graphene/Carbon Nanotubes for Sign Language Recognition.

ACS Applied Materials & Interfaces
Gesture sensors are essential for capturing human movements in human-computer interfaces, but their application is typically hampered by the difficulty of achieving high sensitivity and an ultrawide response range simultaneously. In this article, insp...

Mitigating the Concurrent Interference of Electrode Shift and Loosening in Myoelectric Pattern Recognition Using Siamese Autoencoder Network.

IEEE Transactions on Neural Systems and Rehabilitation Engineering
The objective of this work is to develop a novel myoelectric pattern recognition (MPR) method to mitigate the concurrent interference of electrode shift and loosening, thereby improving the practicality of MPR-based gestural interfaces towards intell...
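The snippet names a Siamese autoencoder but is truncated before any detail. As a rough illustration only, the Python sketch below shows one plausible way such a network could pair features recorded under normal electrode contact with features recorded under shift/loosening and learn a shared latent code; the layer sizes, pairing scheme, and loss are assumptions, not the authors' architecture.

# Hypothetical sketch of a Siamese autoencoder for sEMG features.
# Layer sizes, losses, and the pairing scheme are illustrative assumptions,
# not the architecture described in the article.
import torch
import torch.nn as nn

class SiameseAutoencoder(nn.Module):
    def __init__(self, n_features=64, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(),
            nn.Linear(32, n_features),
        )

    def forward(self, x_clean, x_disturbed):
        # Both branches share the same encoder weights (the "Siamese" part).
        z_clean = self.encoder(x_clean)
        z_disturbed = self.encoder(x_disturbed)
        recon = self.decoder(z_disturbed)
        return z_clean, z_disturbed, recon

model = SiameseAutoencoder()
x_clean = torch.randn(8, 64)      # features with electrodes properly in place
x_disturbed = torch.randn(8, 64)  # same gestures with simulated shift/loosening
z_c, z_d, recon = model(x_clean, x_disturbed)

# One plausible training signal: reconstruct the clean features from the
# disturbed ones while pulling the two latent codes together.
loss = nn.functional.mse_loss(recon, x_clean) + nn.functional.mse_loss(z_d, z_c)
loss.backward()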

Improving Hand Gesture Recognition Robustness to Dynamic Posture Variations by Multimodal Deep Feature Fusion.

IEEE Transactions on Neural Systems and Rehabilitation Engineering
Surface electromyography (sEMG), a human-machine interface for gesture recognition, has shown promising potential for decoding motor intentions, but a variety of nonideal factors restrict its practical application in assistive robots. In this paper, ...
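The abstract is cut off before describing the method, but the title names multimodal deep feature fusion. Purely as a generic illustration, the sketch below concatenates embeddings from an sEMG branch and a second (e.g. inertial/posture) branch before classification; the modalities, dimensions, and layers are assumptions and do not reproduce the paper's network.

# Generic feature-level fusion of two modalities; all sizes are assumed.
import torch
import torch.nn as nn

class FusionNet(nn.Module):
    def __init__(self, emg_dim=64, imu_dim=12, n_gestures=10):
        super().__init__()
        self.emg_branch = nn.Sequential(nn.Linear(emg_dim, 32), nn.ReLU())
        self.imu_branch = nn.Sequential(nn.Linear(imu_dim, 16), nn.ReLU())
        self.head = nn.Linear(32 + 16, n_gestures)

    def forward(self, emg, imu):
        # Concatenate the per-modality embeddings before classification.
        fused = torch.cat([self.emg_branch(emg), self.imu_branch(imu)], dim=1)
        return self.head(fused)

net = FusionNet()
logits = net(torch.randn(4, 64), torch.randn(4, 12))
print(logits.shape)   # (4, 10) gesture scores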

Human hand gesture recognition using fast Fourier transform with coot optimization based on deep neural network.

Network (Bristol, England)
Hand motion detection is particularly important for managing the movement of individuals with amputated limbs. Existing algorithms are complex and time-consuming, and struggle to achieve high accuracy. A DNN is suggested to recognize human hand ...
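The snippet names a fast Fourier transform front end feeding a DNN. As a minimal sketch under assumed parameters, the Python below computes FFT magnitude features from one windowed EMG segment of the kind such a classifier might consume; the sampling rate, window length, and band limits are assumptions, and the coot-optimization step is omitted.

# Illustrative only: FFT magnitude features from a windowed EMG signal.
import numpy as np

fs = 1000                      # assumed sampling rate (Hz)
window = np.random.randn(fs)   # stand-in for one 1 s EMG window

spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)

# Keep the band where most surface-EMG power lies (roughly 20-450 Hz).
band = (freqs >= 20) & (freqs <= 450)
features = spectrum[band]
features /= features.sum()     # normalise so windows are comparable

print(features.shape)          # feature vector for the downstream DNN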

Differentiating hand gestures from forearm muscle activity using machine learning.

International Journal of Occupational Safety and Ergonomics (JOSE)
This study explored the use of forearm electromyography data to distinguish eight hand gestures. The neural network (NN) and random forest (RF) algorithms were tested on data from 10 participants. As window sizes increase from 200 ms to 1000 ms, the ...
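Since the snippet describes classifying windowed forearm EMG with a random forest, the sketch below shows the general windowing-plus-RF idea: segment the signal into fixed-length windows, compute simple time-domain features per channel, and fit a classifier. Window and step sizes, features, and the synthetic data are assumptions for illustration, not the study's protocol.

# Rough sketch of windowed EMG features feeding a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

fs = 1000                                   # assumed sampling rate (Hz)
emg = np.random.randn(8, 10 * fs)           # 8 channels, 10 s of placeholder data
labels_per_sample = np.random.randint(0, 8, 10 * fs)  # 8 gestures (assumed)

win, step = int(0.2 * fs), int(0.1 * fs)    # 200 ms windows, 50 % overlap
X, y = [], []
for start in range(0, emg.shape[1] - win, step):
    seg = emg[:, start:start + win]
    feats = np.concatenate([
        np.mean(np.abs(seg), axis=1),       # mean absolute value per channel
        np.sqrt(np.mean(seg ** 2, axis=1)), # RMS per channel
    ])
    X.append(feats)
    y.append(np.bincount(labels_per_sample[start:start + win]).argmax())

clf = RandomForestClassifier(n_estimators=100).fit(np.array(X), np.array(y))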

Enhanced Hand Gesture Recognition with Surface Electromyogram and Machine Learning.

Sensors (Basel, Switzerland)
This study delves into decoding hand gestures using surface electromyography (EMG) signals collected via a precision Myo-armband sensor, leveraging machine learning algorithms. The research entails rigorous data preprocessing to extract features and ...
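The snippet mentions preprocessing, feature extraction, and machine learning on Myo-armband sEMG but is truncated before specifics. As a minimal sketch assuming pre-extracted feature vectors, the Python below wires a scaler and a classifier into a cross-validated pipeline; the scaler, SVM, gesture count, and feature dimensionality are illustrative choices, not necessarily those used in the study.

# Minimal evaluation sketch over assumed, pre-extracted sEMG window features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X = np.random.randn(500, 32)          # 500 windows x 32 features (assumed)
y = np.random.randint(0, 6, 500)      # 6 hand gestures (assumed)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(pipe, X, y, cv=5)
print(scores.mean())                  # mean cross-validated accuracy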

Sea Horse Optimization-Deep Neural Network: A Medication Adherence Monitoring System Based on Hand Gesture Recognition.

Sensors (Basel, Switzerland)
Medication adherence is an essential aspect of healthcare for patients and is important for achieving medical objectives. However, the lack of standard techniques for measuring adherence is a global concern, making it challenging to accurately monito...

Integration of Tracking, Re-Identification, and Gesture Recognition for Facilitating Human-Robot Interaction.

Sensors (Basel, Switzerland)
For successful human-robot collaboration, it is crucial to establish and sustain quality interaction between humans and robots, making it essential to facilitate human-robot interaction (HRI) effectively. The evolution of robot intelligence now enabl...

Combinatorial Bionic Hierarchical Flexible Strain Sensor for Sign Language Recognition with Machine Learning.

ACS Applied Materials & Interfaces
Flexible strain sensors have been widely researched in fields such as smart wearables, human health monitoring, and biomedical applications. However, achieving a wide sensing range and high sensitivity of flexible strain sensors simultaneously remain...

Coupled Multimodal Emotional Feature Analysis Based on Broad-Deep Fusion Networks in Human-Robot Interaction.

IEEE Transactions on Neural Networks and Learning Systems
A coupled multimodal emotional feature analysis (CMEFA) method based on broad-deep fusion networks, which divide multimodal emotion recognition into two layers, is proposed. First, facial emotional features and gesture emotional features are extracte...