Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
A Lite BERT for Self-supervised Learning of Language Representations, with a large collection of Chinese pre-trained ALBERT models
An Open-Source Framework for Prompt-Learning.
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand new neural operator
Pre-trained Chinese ELECTRA models
[MICCAI 2019] [MEDIA 2020] Models Genesis
An Open-source Knowledgeable Large Language Model Framework.
Official Repository for the Uni-Mol Series Methods
A work in progress to build out solutions in Rust for MLOps
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
Self-supervised contrastive learning for time series via time-frequency consistency
[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
Eden AI: simplifies the use and deployment of AI technologies by providing a single API that connects to the best available AI engines
[ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
[Knowledge Editing] Must-read Papers on Knowledge Editing for Large Language Models.
Exploring Visual Prompts for Adapting Large-Scale Models
Code of the CVPR 2021 Oral paper: A Recurrent Vision-and-Language BERT for Navigation
PERT: Pre-training BERT with Permuted Language Model