# xbert

Implementation of the pre-training model loading architecture of BERT and its variants with TensorFlow 2

## Description

xbert is a Transformer architecture implemented with tf2.keras that can quickly load a pre-trained BERT model for downstream fine-tuning. Stars are welcome, and the project will continue to be updated in the future.
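The core of such a loading architecture is mapping checkpoint variable names onto the weights of the rebuilt Keras model. Below is a minimal, framework-free sketch of that idea; the variable names, shapes, and the `load_weights` helper are illustrative assumptions, not xbert's actual API:

```python
import numpy as np

# Illustrative "checkpoint": variable name -> weight array.
# Real BERT checkpoints use names shaped like these (sizes shrunk for the sketch).
checkpoint = {
    "bert/embeddings/word_embeddings": np.zeros((100, 8)),
    "bert/encoder/layer_0/attention/self/query/kernel": np.ones((8, 8)),
}

# Hypothetical mapping from checkpoint variable names to model weight slots.
name_map = {
    "bert/embeddings/word_embeddings": "embeddings.word",
    "bert/encoder/layer_0/attention/self/query/kernel": "encoder.0.q_kernel",
}

def load_weights(checkpoint, name_map):
    """Resolve each checkpoint variable to its model-side name."""
    model_weights = {}
    for ckpt_name, array in checkpoint.items():
        model_weights[name_map[ckpt_name]] = array
    return model_weights

weights = load_weights(checkpoint, name_map)
print(weights["encoder.0.q_kernel"].shape)  # (8, 8)
```

In a real loader, the arrays would come from `tf.train.load_checkpoint` and be assigned to the matching Keras layers; the dictionary lookup above stands in for that assignment step.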

## Install

Installation is currently supported via:

```shell
pip install git+https://github.com/xuyingjie521/xbert.git
```

## Features

Features implemented so far:

- Load pre-trained weights of BERT/RoBERTa for fine-tuning.
- Support for tf2.keras.
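Supporting both BERT and RoBERTa checkpoints in one loading path largely comes down to normalizing their differing variable-name prefixes. A hedged sketch of that normalization follows; the specific prefixes and replacements are assumptions for illustration, not the exact names used by xbert:

```python
def normalize_name(ckpt_name):
    """Map RoBERTa-style variable names onto the BERT naming scheme
    so a single loading routine can serve both (illustrative prefixes only)."""
    replacements = {
        "roberta/": "bert/",
        "LayerNorm": "layer_norm",
    }
    for old, new in replacements.items():
        ckpt_name = ckpt_name.replace(old, new)
    return ckpt_name

print(normalize_name("roberta/encoder/layer_0/attention/output/LayerNorm/gamma"))
# bert/encoder/layer_0/attention/output/layer_norm/gamma
```

Names that already follow the BERT scheme pass through unchanged, so the same routine can be applied to every checkpoint variable before lookup.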

## Pre-trained models that can be loaded