Toward explainable AI-empowered cognitive health assessment.
Journal:
Frontiers in Public Health
Published Date:
Mar 9, 2023
Abstract
Explainable artificial intelligence (XAI) is of paramount importance to various domains, including healthcare, fitness, skill assessment, and personal assistants, for understanding and explaining the decision-making process of the artificial intelligence (AI) model. Smart homes embedded with smart devices and sensors have enabled many context-aware applications that recognize physical activities. This study presents a novel XAI-empowered human activity recognition (HAR) approach based on key features identified from data collected by sensors located at different places in a smart home. The approach identifies a set of new features (e.g., the total number of sensors used in a specific activity) based on weighting criteria. It then applies statistical features (i.e., mean, standard deviation) to handle outliers and high class variance. The proposed approach is evaluated using machine learning models, namely random forest (RF), K-nearest neighbor (KNN), support vector machine (SVM), decision tree (DT), and naive Bayes (NB), and deep learning models, namely deep neural network (DNN), convolutional neural network (CNN), and CNN-based long short-term memory (CNN-LSTM). Experiments demonstrate the superior performance of the RF classifier over all other machine learning and deep learning models. For explainability, the approach uses Local Interpretable Model-Agnostic Explanations (LIME) with the RF classifier. It achieves an F-score of 0.96 for healthy versus dementia classification, and F-scores of 0.95 and 0.97 for activity recognition of dementia and healthy individuals, respectively.
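The abstract describes the pipeline only at a high level; the sketch below illustrates, under stated assumptions, how the named ingredients (a physical key feature such as the number of active sensors, statistical key features such as per-sensor mean and standard deviation, an RF classifier, and LIME explanations) could fit together using scikit-learn and the lime package. The synthetic data, 10-sensor layout, activation threshold, and class names are placeholders for illustration, not the authors' dataset or implementation.

```python
# Minimal sketch of the described pipeline, assuming synthetic sensor data:
# key features -> random forest classification -> LIME explanation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score
from lime.lime_tabular import LimeTabularExplainer

rng = np.random.default_rng(0)

# Placeholder sensor readings: (samples, sensors, time steps), and
# binary labels (0 = healthy, 1 = dementia). Purely illustrative.
raw = rng.normal(size=(500, 10, 50))
labels = rng.integers(0, 2, size=500)

# Physical key feature: number of distinct sensors triggered per window
# (the activation threshold of 1.0 is an assumption).
active_count = (np.abs(raw) > 1.0).any(axis=2).sum(axis=1, keepdims=True)

# Statistical key features: per-sensor mean and standard deviation.
features = np.hstack([active_count, raw.mean(axis=2), raw.std(axis=2)])
feature_names = (["active_sensor_count"]
                 + [f"sensor{i}_mean" for i in range(10)]
                 + [f"sensor{i}_std" for i in range(10)])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

# Random forest classifier, evaluated with the F-score.
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(X_train, y_train)
print("F-score:", f1_score(y_test, rf.predict(X_test)))

# LIME: explain a single prediction of the trained random forest.
explainer = LimeTabularExplainer(
    X_train, feature_names=feature_names,
    class_names=["healthy", "dementia"], mode="classification")
explanation = explainer.explain_instance(
    X_test[0], rf.predict_proba, num_features=5)
print(explanation.as_list())
```

The final `as_list()` call returns the locally weighted feature contributions for one prediction, which is the kind of per-instance explanation LIME provides alongside the RF classifier's output.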