AIMC Topic: Gestures

Showing 41 to 50 of 246 articles

Differentiating hand gestures from forearm muscle activity using machine learning.

International journal of occupational safety and ergonomics: JOSE
This study explored the use of forearm electromyography data to distinguish eight hand gestures. The neural network (NN) and random forest (RF) algorithms were tested on data from 10 participants. As window sizes increase from 200 ms to 1000 ms, the ...
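The windowing-plus-classifier pipeline this abstract describes can be sketched minimally. The snippet below is an illustration only, not the study's code: it uses synthetic stand-in signals (the sampling rate, features, and two-class setup are all assumptions), slices them into fixed-length windows as in the 200 ms to 1000 ms comparison, extracts simple time-domain features, and fits a random forest.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
FS = 1000  # assumed sampling rate in Hz (not stated in the abstract)

def window_features(emg, win_ms=200, fs=FS):
    """Slice a 1-D EMG signal into non-overlapping windows and
    extract simple time-domain features (RMS, mean absolute value)."""
    win = int(fs * win_ms / 1000)
    n = len(emg) // win
    segs = emg[: n * win].reshape(n, win)
    rms = np.sqrt((segs ** 2).mean(axis=1))
    mav = np.abs(segs).mean(axis=1)
    return np.column_stack([rms, mav])

# Synthetic stand-in: two "gestures" that differ only in amplitude.
X, y = [], []
for label, amp in enumerate([0.5, 1.5]):
    signal = amp * rng.standard_normal(FS * 10)  # 10 s per gesture
    feats = window_features(signal)
    X.append(feats)
    y.append(np.full(len(feats), label))
X, y = np.vstack(X), np.concatenate(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print(f"window accuracy: {clf.score(Xte, yte):.2f}")
```

Varying `win_ms` here mirrors the abstract's window-size comparison: longer windows give smoother feature estimates but slower gesture-to-decision latency.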

Enhanced Hand Gesture Recognition with Surface Electromyogram and Machine Learning.

Sensors (Basel, Switzerland)
This study delves into decoding hand gestures using surface electromyography (EMG) signals collected via a precision Myo-armband sensor, leveraging machine learning algorithms. The research entails rigorous data preprocessing to extract features and ...

Sea Horse Optimization-Deep Neural Network: A Medication Adherence Monitoring System Based on Hand Gesture Recognition.

Sensors (Basel, Switzerland)
Medication adherence is an essential aspect of healthcare for patients and is important for achieving medical objectives. However, the lack of standard techniques for measuring adherence is a global concern, making it challenging to accurately monitor...

Integration of Tracking, Re-Identification, and Gesture Recognition for Facilitating Human-Robot Interaction.

Sensors (Basel, Switzerland)
For successful human-robot collaboration, it is crucial to establish and sustain quality interaction between humans and robots, making it essential to facilitate human-robot interaction (HRI) effectively. The evolution of robot intelligence now enabl...

Combinatorial Bionic Hierarchical Flexible Strain Sensor for Sign Language Recognition with Machine Learning.

ACS applied materials & interfaces
Flexible strain sensors have been widely researched in fields such as smart wearables, human health monitoring, and biomedical applications. However, achieving a wide sensing range and high sensitivity of flexible strain sensors simultaneously remain...

Coupled Multimodal Emotional Feature Analysis Based on Broad-Deep Fusion Networks in Human-Robot Interaction.

IEEE transactions on neural networks and learning systems
A coupled multimodal emotional feature analysis (CMEFA) method based on broad-deep fusion networks, which divide multimodal emotion recognition into two layers, is proposed. First, facial emotional features and gesture emotional features are extracted...

Elbow Gesture Recognition with an Array of Inductive Sensors and Machine Learning.

Sensors (Basel, Switzerland)
This work presents a novel approach for elbow gesture recognition using an array of inductive sensors and a machine learning algorithm (MLA). This paper describes the design of the inductive sensor array integrated into a flexible and wearable sleeve...

Post-stroke hand gesture recognition via one-shot transfer learning using prototypical networks.

Journal of neuroengineering and rehabilitation
BACKGROUND: In-home rehabilitation systems are a promising, potential alternative to conventional therapy for stroke survivors. Unfortunately, physiological differences between participants and sensor displacement in wearable sensors pose a significant...
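The prototypical-network idea behind this one-shot transfer approach is simple to state: each class is represented by the mean of its support embeddings (its "prototype"), and a query is assigned to the nearest prototype. The sketch below illustrates only that classification rule with toy 2-D embeddings; the actual embedding network and the study's data are not reproduced here.

```python
import numpy as np

def prototypes(support, labels):
    """Mean embedding per class (the class 'prototype')."""
    classes = np.unique(labels)
    return classes, np.stack([support[labels == c].mean(axis=0) for c in classes])

def classify(queries, classes, protos):
    """Assign each query embedding to the nearest prototype (Euclidean)."""
    d = np.linalg.norm(queries[:, None, :] - protos[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

# Toy embeddings: one support example per class, i.e. the one-shot case.
support = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = np.array([0, 1])
classes, protos = prototypes(support, labels)

queries = np.array([[0.2, -0.1], [4.8, 5.3]])
print(classify(queries, classes, protos))  # → [0 1]
```

In the one-shot transfer setting the abstract describes, this lets a new user's gestures be recognized from a single labeled example each, sidestepping per-user retraining.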

Real-Time Arabic Sign Language Recognition Using a Hybrid Deep Learning Model.

Sensors (Basel, Switzerland)
Sign language is an essential means of communication for individuals with hearing disabilities. However, there is a significant shortage of sign language interpreters in some languages, especially in Saudi Arabia. This shortage results in a large pro...

Breaking the silence: empowering the mute-deaf community through automatic sign language decoding.

Biomedizinische Technik. Biomedical engineering
OBJECTIVES: The objective of this study is to develop a system for automatic sign language recognition to improve the quality of life for the mute-deaf community in Egypt. The system aims to bridge the communication gap by identifying and converting ...