Commit 5f78b7da authored by ymcui

add tf2 model support

Parent f0f8b8b1
......@@ -27,7 +27,9 @@ Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping
See more resources released by the Joint Laboratory of HIT and iFLYTEK Research (HFL): https://github.com/ymcui/HFL-Anthology
## News
**2020/9/15 Our paper ["Revisiting Pre-Trained Models for Chinese Natural Language Processing"](https://arxiv.org/abs/2004.13922) is accepted to [Findings of EMNLP](https://2020.emnlp.org) as a long paper.**
**2021/1/27 All models now support TensorFlow 2. Please access or download them through the `transformers` library: https://huggingface.co/hfl** (a loading sketch follows these news items)
2020/9/15 Our paper ["Revisiting Pre-Trained Models for Chinese Natural Language Processing"](https://arxiv.org/abs/2004.13922) is accepted to [Findings of EMNLP](https://2020.emnlp.org) as a long paper.
2020/8/27 HFL tops the GLUE benchmark for general natural language understanding; see the [GLUE leaderboard](https://gluebenchmark.com/leaderboard) and the [news](http://dwz.date/ckrD)
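As a quick illustration of the TensorFlow 2 support announced above, the sketch below loads one of the released checkpoints through the `transformers` library. It is a minimal sketch, not part of the release notes: the checkpoint name `hfl/chinese-bert-wwm-ext` is one illustrative pick from the https://huggingface.co/hfl listing, and the other released models should load the same way.

```python
# A minimal sketch: loading a released checkpoint in TensorFlow 2 via transformers.
# "hfl/chinese-bert-wwm-ext" is one checkpoint from https://huggingface.co/hfl,
# chosen here only for illustration.
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = TFBertModel.from_pretrained("hfl/chinese-bert-wwm-ext")

# Encode a sample sentence and run a forward pass to get the hidden states.
inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="tf")
outputs = model(inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```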
......@@ -39,6 +41,8 @@ Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping
2020/1/20 Happy Year of the Rat! We've released RBT3 and RBTL3 (3-layer RoBERTa-wwm-ext-base/large); see [Small Models](#小参数量模型)
<details>
<summary>Past News</summary>
2019/12/19 The models in this repository can now be accessed through [Huggingface-Transformers](https://github.com/huggingface/transformers); see [Quick Load](#快速加载)
2019/10/14 We release the RoBERTa-wwm-ext-large model; see [Download](#中文模型下载)
......@@ -48,7 +52,7 @@ Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping
2019/7/30 We provide the Chinese `BERT-wwm-ext` model trained on a larger general-domain corpus (5.4B tokens); see [Download](#中文模型下载)
2019/6/20 Initial release; the models can be downloaded through Google Drive, and uploads to cloud drives in mainland China are complete; see [Download](#中文模型下载)
</details>
## Guide
| Section | Description |
......
......@@ -19,7 +19,9 @@ More resources by HFL: https://github.com/ymcui/HFL-Anthology
## News
**2020/9/15 Our paper ["Revisiting Pre-Trained Models for Chinese Natural Language Processing"](https://arxiv.org/abs/2004.13922) is accepted to [Findings of EMNLP](https://2020.emnlp.org) as a long paper.**
**2021/1/27 All models now support TensorFlow 2. Please access them through the `transformers` library or download them from https://huggingface.co/hfl** (see the sketch after these news items)
2020/9/15 Our paper ["Revisiting Pre-Trained Models for Chinese Natural Language Processing"](https://arxiv.org/abs/2004.13922) is accepted to [Findings of EMNLP](https://2020.emnlp.org) as a long paper.
2020/8/27 We are happy to announce that our model is on top of the GLUE benchmark; check the [leaderboard](https://gluebenchmark.com/leaderboard).
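For completeness, here is a corresponding PyTorch sketch of the quick-load path. The checkpoint name `hfl/chinese-roberta-wwm-ext` is again an illustrative pick from https://huggingface.co/hfl; the RoBERTa-wwm checkpoints share BERT's architecture, so this sketch assumes they are loaded with the `Bert*` classes.

```python
# A minimal sketch: quick-loading a checkpoint in PyTorch via transformers.
# "hfl/chinese-roberta-wwm-ext" is an illustrative name from https://huggingface.co/hfl;
# loading it with BertTokenizer/BertModel is an assumption of this sketch,
# based on the RoBERTa-wwm models sharing BERT's architecture.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext")

inputs = tokenizer("Chinese pre-trained models", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```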
......@@ -29,6 +31,8 @@ More resources by HFL: https://github.com/ymcui/HFL-Anthology
2020/1/20 Happy Chinese New Year! We've released RBT3 and RBTL3 (3-layer RoBERTa-wwm-ext-base/large), check [Small Models](#Small-Models)
<details>
<summary>Past News</summary>
2019/12/19 The models in this repository can now be easily accessed through [Huggingface-Transformers](https://github.com/huggingface/transformers), check [Quick Load](#Quick-Load)
2019/10/14 We release `RoBERTa-wwm-ext-large`, check [Download](#Download)
......@@ -38,7 +42,7 @@ More resources by HFL: https://github.com/ymcui/HFL-Anthology
2019/7/30 We release `BERT-wwm-ext`, which was trained on a larger corpus, check [Download](#Download)
2019/6/20 Initial version; pre-trained models can be downloaded through Google Drive, check [Download](#Download)
</details>
## Guide
| Section | Description |
......