https://brave.com/research/kd/
Membership and Memorization in LLM Knowledge Distillation | Brave
The Brave browser is a fast, private and secure web browser for PC, Mac and mobile. Download now to enjoy a faster ad-free browsing experience that saves data...
knowledge distillation, membership, llm, brave
https://huggingface.co/docs/trl/gkd_trainer
Generalized Knowledge Distillation Trainer · Hugging Face
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
knowledge distillation, hugging face, trainer
https://huggingface.co/papers/2004.09813
Paper page - Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation
Join the discussion on this paper page
knowledge distillation, paper, making, sentence, embeddings
https://www.amazon.science/publications/knowledge-distillation-for-large-language-models-through-residual-learning
Knowledge distillation for large language models through residual learning - Amazon Science
Knowledge distillation has become a crucial technique to transfer the capacities of large language models (LLMs) to smaller, more efficient models for...
large language models, knowledge distillation, amazon science, learning