This repository contains the Romanian version of DistilBERT.
A template for creating Autodistill Target Model packages.
This is a fork of the distilling-step-by-step repository with the aim of creating a task-specific LLM distillation framework for healthcare.
A tutorial on how to prune the embedding layer of a language model and craft a suitable tokenizer (see the sketch below).
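The core move in embedding pruning is to keep only the rows of the embedding matrix for tokens you actually need, then remap token ids. A minimal PyTorch sketch, assuming a hypothetical `keep_ids` subset and BERT-like sizes; the tutorial's actual model and tokenizer handling are not shown here:

```python
import torch
import torch.nn as nn

# Hypothetical original embedding (BERT-like sizes assumed).
old_embedding = nn.Embedding(num_embeddings=30000, embedding_dim=768)

# Hypothetical reduced vocabulary: token ids observed in the target corpus.
keep_ids = torch.tensor([0, 1, 2, 5, 17, 42])

# Build a smaller embedding holding only the kept rows.
new_embedding = nn.Embedding(len(keep_ids), old_embedding.embedding_dim)
with torch.no_grad():
    new_embedding.weight.copy_(old_embedding.weight[keep_ids])

# The tokenizer must then map old ids to positions in keep_ids.
old_to_new = {int(old): new for new, old in enumerate(keep_ids.tolist())}
```

Any real pipeline also has to rebuild the tokenizer's vocabulary so it emits the remapped ids directly.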
Prompt engineering for developers
summer internship project @ JetBrains Research
Alternus Vera Project
Learn how to make a smaller network perform as well as a large ensemble model, accelerating inference (a loss sketch follows below).
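The usual mechanism here is the soft-target loss of Hinton et al. (2015): the student matches the teacher's temperature-softened outputs while still fitting the hard labels. A minimal PyTorch sketch; `T` and `alpha` are illustrative hyperparameters, not values from the repository:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # KL divergence between temperature-softened student and teacher outputs.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

With an ensemble teacher, `teacher_logits` would be the averaged (or otherwise combined) ensemble outputs.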
Distillation of GANs with fairness constraints
A PyTorch-based knowledge distillation toolkit for natural language processing
An implementation of the paper "Automated training of location-specific edge models for traffic counting".
An entrance test for a Computer Vision / NLP researcher job
A Series on Optimizing Transformer-Based Models
Efficient Inference techniques implemented in PyTorch for computer vision.
Distillation examples: making speaker recognition faster through different model compression techniques.
Deep Mutual Learning in PaddlePaddle
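In Deep Mutual Learning (Zhang et al., 2018), two students train together, each adding a KL term toward the other's prediction on top of its own supervised loss. The repository is in PaddlePaddle; the sketch below uses PyTorch purely for illustration:

```python
import torch
import torch.nn.functional as F

def mutual_learning_losses(logits_a, logits_b, labels):
    # Each network treats its peer's prediction as a fixed target (detach).
    kl_a = F.kl_div(F.log_softmax(logits_a, dim=-1),
                    F.softmax(logits_b.detach(), dim=-1),
                    reduction="batchmean")
    kl_b = F.kl_div(F.log_softmax(logits_b, dim=-1),
                    F.softmax(logits_a.detach(), dim=-1),
                    reduction="batchmean")
    # Supervised cross-entropy plus the mimicry term, one loss per network.
    loss_a = F.cross_entropy(logits_a, labels) + kl_a
    loss_b = F.cross_entropy(logits_b, labels) + kl_b
    return loss_a, loss_b
```

Each loss is backpropagated through its own network every step, so the two models improve jointly rather than one distilling a frozen teacher.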
A program that uses a "bit-banging" approach to control the entire distillation process for various liquids.
Optimising training, inference, and throughput of expensive ML models.
【NCA】Learning Metric Space with Distillation for Large-Scale Multi-Label Text Classification