Professors Elham Emami and Samira Rahimi organized and co-led an international, interdisciplinary workshop in June 2023 at McGill University, built on an intersectoral approach to equity, diversity, and inclusion in the field of AI.
Knowledge Distillation (KD) is a widely used method for model compression. In essence, KD trains a smaller student model to mimic a larger teacher model, aiming to retain as much of the teacher's performance as possible. Howe...
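The student-teacher training idea above can be sketched as a loss function in the style of Hinton-style distillation. This is a minimal illustrative sketch, not any particular paper's implementation: it assumes classification with raw logits, a temperature-softened KL term that pulls the student toward the teacher's distribution, and a standard cross-entropy term on the ground-truth label; the function names and the `alpha`/`temperature` parameters are illustrative choices.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling; a higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Combined KD loss: alpha * soft (teacher-matching) term
    plus (1 - alpha) * hard cross-entropy on the true label."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # Soft term: KL(teacher || student) at temperature T, scaled by T^2
    # so its gradient magnitude stays comparable to the hard term.
    soft = temperature ** 2 * sum(
        p * math.log(p / q) for p, q in zip(p_teacher, p_student)
    )
    # Hard term: ordinary cross-entropy on the ground-truth label at T = 1.
    hard = -math.log(softmax(student_logits)[true_label])
    return alpha * soft + (1 - alpha) * hard
```

In practice the soft term is what transfers the teacher's "dark knowledge" (the relative probabilities it assigns to wrong classes); when the student's logits exactly match the teacher's, that term vanishes and only the hard cross-entropy remains.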
BACKGROUND: Since the release of ChatGPT, numerous positive applications for this artificial intelligence (AI) tool in higher education have emerged. Faculty can reduce their workload by adopting AI tools. While course evaluations are a common too...