Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems
Witold Pedrycz · Shyi-Ming Chen
July 2023 · Studies in Computational Intelligence, Vol. 1100 · Springer Nature
Ebook · 232 pages
About this ebook
The book provides a timely coverage of the paradigm of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned in the general setting of transfer learning, in which a lightweight student model is effectively learned from a large teacher model. The book covers a variety of training schemes, teacher–student architectures, and distillation algorithms, as well as a wealth of topics including recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is of relevance to a broad audience, including researchers and practitioners active in machine learning and pursuing fundamental and applied research in advanced learning paradigms.
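To illustrate the teacher–student setup the blurb refers to, below is a minimal sketch of the classic soft-target distillation loss (in the spirit of Hinton et al., 2015), assuming PyTorch. The function name and hyperparameters are illustrative, not taken from the book, and individual chapters may use different distillation objectives.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend soft-target guidance from the teacher with hard-label cross-entropy."""
    # Soft targets: KL divergence between temperature-softened distributions
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitudes match the hard-label term
    # Hard targets: ordinary cross-entropy against the ground-truth labels
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

During training, the teacher's logits are computed with gradients disabled, and only the student's parameters are updated with this combined loss.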