Brevitas: neural network quantization in PyTorch
Updated Jul 16, 2024 - Python
A more readable and flexible YOLOv5 with additional backbones (GCN, ResNet, ShuffleNet, MobileNet, EfficientNet, HRNet, Swin-Transformer, etc.) and modules (CBAM, DCN, and so on), plus TensorRT support.
YOLO model QAT and deployment with DeepStream & TensorRT
Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. This project provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.
An automated toolkit for analyzing and modifying the structure of PyTorch models, including a model-compression algorithm library that analyzes model structure automatically.
QAT (quantization-aware training) for classification with MQBench
FakeQuantize with Learned Step Size (LSQ+) as an Observer in PyTorch
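The core of a learned-step-size fake quantizer can be sketched in a few lines. This is a framework-free illustration of the general LSQ idea, not the repository's actual API; the function name, default ranges, and the gradient note in the comments are assumptions.

```python
# Minimal sketch of LSQ-style fake quantization (forward pass only).
# Names and the int8 range are illustrative, not this repo's API.

def fake_quantize(x, step, qmin=-128, qmax=127):
    """Quantize-dequantize x using a (learnable) step size."""
    q = round(x / step)           # scale onto the integer grid
    q = max(qmin, min(qmax, q))   # clamp to the integer range
    return q * step               # dequantize back to float

# During training, LSQ treats `step` as a parameter: the straight-through
# estimator passes gradients through round(), so the step size is learned
# jointly with the weights instead of being fixed by an observer.
```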
This project enables Intel® platform technologies (SGX, QAT) and GPUs on Red Hat OpenShift Container Platform
Quantization examples for PTQ & QAT
Training a U-Net-based convolutional neural network to automatically identify and delineate areas of qat agriculture in Sentinel-2 multispectral imagery.
Build an AI model to classify beverages for blind individuals
Official website of qat programming language...
Combidata is a flexible and powerful Python library designed for generating various combinations of test data based on defined cases and rules. It is especially useful for testing, debugging, and analyzing software applications and systems.
EfficientNetV2 (efficientnetv2-b2) with int8 and fp32 quantization (QAT and PTQ) on the CK+ dataset: fine-tuning, augmentation, handling an imbalanced dataset, etc.
Quantization of models: Post-Training Quantization (PTQ) and Quantization-Aware Training (QAT)
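The difference between the two approaches can be sketched without any framework: PTQ calibrates a scale from observed activations after training, while QAT inserts a quantize-dequantize step into the forward pass during training. This is an illustrative, framework-free sketch; real workflows would use a library such as `torch.ao.quantization`, and the names and symmetric-int8 choice here are assumptions.

```python
# Minimal, framework-free sketch contrasting PTQ and QAT (illustrative).

def calibrate_scale(samples, qmax=127):
    """PTQ: derive a symmetric int8 scale from observed activations."""
    return max(abs(min(samples)), abs(max(samples))) / qmax

def fake_quant(x, scale, qmin=-128, qmax=127):
    """Quantize-dequantize one value. QAT runs this inside the forward
    pass so the network trains against quantization error (gradients
    flow via the straight-through estimator)."""
    q = max(qmin, min(qmax, round(x / scale)))
    return q * scale
```

In practice PTQ only needs a small calibration set and no retraining, while QAT usually recovers more accuracy at low bit widths because the weights adapt to the quantization grid.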