AI Medical Compendium Topic

Explore the latest research on artificial intelligence and machine learning in medicine.

Distillation

Showing 1 to 10 of 14 articles


Interpretable pairwise distillations for generative protein sequence models.

PLoS computational biology
Many different types of generative models for protein sequences have been proposed in the literature. Their uses include the prediction of mutational effects, protein design, and the prediction of structural properties. Neural network (NN) architectures h...

Inferior and Coordinate Distillation for Object Detectors.

Sensors (Basel, Switzerland)
Current distillation methods only distill between corresponding layers, and do not consider the knowledge contained in preceding layers. To solve this problem, we analyzed the guiding effect of the inferior features of a teacher model on the coordina...

Compressed gastric image generation based on soft-label dataset distillation for medical data sharing.

Computer methods and programs in biomedicine
BACKGROUND AND OBJECTIVE: Sharing of medical data is required to enable the cross-agency flow of healthcare information and construct high-accuracy computer-aided diagnosis systems. However, the large sizes of medical datasets, the massive amount o...

Reliable Mutual Distillation for Medical Image Segmentation Under Imperfect Annotations.

IEEE transactions on medical imaging
Convolutional neural networks (CNNs) have made enormous progress in medical image segmentation. The learning of CNNs is dependent on a large amount of training data with fine annotations. The workload of data labeling can be significantly relieved vi...

Heterogeneous Collaborative Learning for Personalized Healthcare Analytics via Messenger Distillation.

IEEE journal of biomedical and health informatics
The Healthcare Internet-of-Things (IoT) framework aims to provide personalized medical services with edge devices. Due to the inevitable data sparsity on an individual device, cross-device collaboration is introduced to enhance the power of distribut...

Investigation of direct contact membrane distillation (DCMD) performance using CFD and machine learning approaches.

Chemosphere
Direct Contact Membrane Distillation (DCMD) is emerging as an effective method for water desalination, known for its efficiency and adaptability. This study delves into the performance of DCMD by integrating two powerful analytical tools: Computation...

Prototype-based sample-weighted distillation unified framework adapted to missing modality sentiment analysis.

Neural networks: the official journal of the International Neural Network Society
Missing modality sentiment analysis is a prevalent and challenging issue in real life. Furthermore, the heterogeneity of multimodality often leads to an imbalance in optimization when attempting to optimize the same objective across all modalities in...

Cross-modal knowledge distillation for continuous sign language recognition.

Neural networks: the official journal of the International Neural Network Society
Continuous Sign Language Recognition (CSLR) is the task of converting a sign language video into a gloss sequence. Existing deep learning-based sign language recognition methods usually rely on large-scale training data and rich supervised informa...

Lightweight CNN combined with knowledge distillation for the accurate determination of black tea fermentation degree.

Food research international (Ottawa, Ont.)
Black tea is the second most common type of tea in China. Fermentation is one of the most critical processes in its production; whether insufficient or excessive, it affects the quality of the finished product. At present, the determination...

Multiview attention networks for fine-grained watershed categorization via knowledge distillation.

PloS one
With the rapid development of artificial intelligence technology, an increasing number of village-related modeling problems have been addressed. However, first, the exploration of village-related watershed fine-grained classification problems, partic...