Kashgari is a production-level NLP transfer-learning framework built on top of tf.keras for text labeling and text classification; it includes Word2Vec, BERT, and GPT-2 language embeddings.
Entity and relation extraction based on TensorFlow and BERT. A pipeline-style entity and relation extraction system, built as a solution to the information-extraction task of the 2019 Language and Intelligence Technology Competition. Schema-based Knowledge Extraction, SKE 2019.
[CVPR 2021] Official PyTorch implementation of Transformer Interpretability Beyond Attention Visualization, a novel method for visualizing classifications made by Transformer-based networks.
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
End-to-End recipes for pre-training and fine-tuning BERT using Azure Machine Learning Service
Portuguese pre-trained BERT models
Trained models & code to predict toxic comments on all 3 Jigsaw Toxic Comment Challenges. Built using ⚡ Pytorch Lightning and 🤗 Transformers. For access to our API, please email us at [email protected].
Abstractive summarisation using BERT as the encoder and a Transformer decoder.
BERT-NER (nert-bert) with Google BERT: https://github.com/google-research.
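BERT-NER systems emit one BIO tag per token, and a small post-processing step collapses those tags into entity spans. The sketch below illustrates that decoding step in generic form; it is not taken from the nert-bert repository, and the function name `decode_bio` is made up for illustration.

```python
def decode_bio(tokens, tags):
    """Collapse per-token BIO tags into (entity_text, label, start, end) spans.

    Generic illustration of BIO decoding for a BERT-NER model's output;
    not the nert-bert repository's actual code.
    """
    spans = []
    start, label = None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the final span
        # Close the current span on "O", on a new "B-", or on an "I-" whose
        # label disagrees with the open span.
        if tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and tag[2:] != label
        ):
            if start is not None:
                spans.append((" ".join(tokens[start:i]), label, start, i))
                start, label = None, None
        if tag.startswith("B-"):
            start, label = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            start, label = i, tag[2:]  # tolerate an "I-" with no preceding "B-"
    return spans
```

For example, tokens `["John", "Smith", "lives", "in", "Paris"]` with tags `["B-PER", "I-PER", "O", "O", "B-LOC"]` decode to a PER span covering tokens 0–2 and a LOC span covering token 4.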
A Model for Natural Language Attack on Text Classification and Inference
BlueBERT, pre-trained on PubMed abstracts and clinical notes (MIMIC-III).
BERT, AWS RDS, AWS Forecast, EMR Spark Cluster, Hive, Serverless, Google Assistant + Raspberry Pi, Infrared, Google Cloud Platform Natural Language, Anomaly detection, Tensorflow, Mathematics
Multiple-Relations-Extraction-Only-Look-Once. Look at the sentence just once and extract the multiple pairs of entities and their corresponding relations. An end-to-end joint multi-relation extraction model, usable for the http://lic2019.ccf.org.cn/kg information-extraction task.
Can we use explanations to improve hate speech models? Our paper accepted at AAAI 2021 tries to explore that question.
BETO - Spanish version of the BERT model
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
Text classification with a BERT model, intended for industrial use.
An easy-to-use interface to fine-tuned BERT models for computing semantic similarity in clinical and web text. That's it.