https://openreview.net/forum?id=M5gEdo_gMY_
Exploiting available medical records to train high-performance computer-aided diagnosis (CAD) models via the semi-supervised learning (SSL) setting is emerging...
knowledge distillation, adaptive, asymmetric, label sharpening
https://arxiv.org/abs/2505.11897v1
Abstract page for arXiv paper 2505.11897v1: FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer
fine-grained, knowledge distillation, high-frequency, via
https://arxiv.org/abs/2310.00096v1
Abstract page for arXiv paper 2310.00096v1: Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation
towards, call, model, stealing, via
https://deepai.org/publication/a-study-on-knowledge-distillation-from-weak-teacher-for-scaling-up-pre-trained-language-models
05/26/23 - Distillation from Weak Teacher (DWT) is a method for transferring knowledge from a smaller, weaker teacher model to a larger studen...
knowledge distillation, study, weak, teacher, scaling
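The DWT entry above builds on standard knowledge distillation. As background only (the linked paper's actual loss is not shown on this page), the classic soft-target distillation objective can be sketched as a weighted mix of a temperature-softened KL term against the teacher and a cross-entropy term on the ground-truth labels; `T` and `alpha` below are illustrative hyperparameters, not values from the paper:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic soft-target distillation loss (Hinton-style sketch):
    alpha * KL(student || teacher at temperature T) + (1 - alpha) * CE(labels)."""
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # The KL term is scaled by T^2 so its gradient magnitude stays
    # comparable to the cross-entropy term as T varies.
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

In the DWT setting the roles are inverted relative to the usual setup: the teacher is the smaller model, but the loss shape itself is unchanged.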
https://deepai.org/publication/x-3kd-knowledge-distillation-across-modalities-tasks-and-stages-for-multi-camera-3d-object-detection
03/03/23 - Recent advances in 3D object detection (3DOD) have obtained remarkably strong results for LiDAR-based models. In contrast, surroun...
knowledge distillation, x, across, modalities, tasks