Inference Time of a CamemBERT Deep Learning Model for Sentiment Analysis of COVID Vaccines on Twitter.
Journal:
Studies in health technology and informatics
Published Date:
Jun 29, 2022
Abstract
In previous work, we implemented a deep learning model with CamemBERT and PyTorch and built a microservices architecture using the TorchServe serving library. Without TorchServe, inference was three times faster when the model was loaded once into memory than when it was reloaded for each request. The preloaded model without TorchServe showed inference times comparable to the TorchServe instance. However, using a preloaded PyTorch model in a web application without TorchServe would require implementing functionality that TorchServe already provides.
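To illustrate the comparison described in the abstract, the sketch below contrasts a model loaded once at start-up with one reloaded on every call. This is not the authors' implementation: the Hugging Face Transformers classes and the "camembert-base" checkpoint name are assumptions chosen for illustration.

```python
# Minimal sketch (assumed checkpoint name and Transformers API), not the paper's code.
import torch
from transformers import CamembertTokenizer, CamembertForSequenceClassification

MODEL_NAME = "camembert-base"  # hypothetical checkpoint for illustration

# Preloaded variant: tokenizer and model are loaded once, at application start-up.
tokenizer = CamembertTokenizer.from_pretrained(MODEL_NAME)
model = CamembertForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()

def predict_preloaded(text: str) -> int:
    """Inference with the model already resident in memory."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1))

def predict_reloaded(text: str) -> int:
    """Slow baseline: the model is reloaded from disk for every request."""
    tok = CamembertTokenizer.from_pretrained(MODEL_NAME)
    mdl = CamembertForSequenceClassification.from_pretrained(MODEL_NAME)
    mdl.eval()
    inputs = tok(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = mdl(**inputs).logits
    return int(logits.argmax(dim=-1))
```

Timing repeated calls to the two functions reproduces the kind of comparison the abstract reports: the per-call model loading dominates the latency of `predict_reloaded`, while `predict_preloaded` approximates what a serving layer such as TorchServe provides by keeping the model resident.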