Knowledge distillation implemented in TensorFlow
Modelling of distillation column from basic mass balance equations.
Learn how to make a smaller network match a large ensemble model and accelerate inference (a minimal distillation-loss sketch appears after this list).
model-compression-and-acceleration-4-DNN
Alternus Vera Project
✂️ Dataset Culling: Faster training of domain specific models with distillation ✂️ (IEEE ICIP 2019)
A distilled RoBERTa-wwm base model, trained with RoBERTa-wwm-large as the teacher.
A program that uses a "bit banging" approach to control the entire process of distillation of various liquids
A PyTorch-based knowledge distillation toolkit for natural language processing
Deep Mutual Learning in PaddlePaddle
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and models specialized for similarity.
Distillation examples: making speaker recognition faster through different model compression techniques.
[ECCV 2020] Code release for "Resolution Switchable Networks for Runtime Efficient Image Recognition"
Implementation of several variations of the iCaRL incremental learning algorithm in PyTorch.
Efficient Crowd Counting via Structured Knowledge Transfer (SKT, ACM MM 2020)
Distillation and some other iterative methods for fastText.
Efficient Inference techniques implemented in PyTorch for computer vision.