Here are 13 public repositories matching this topic...
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Updated Jun 3, 2025 · Python
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Updated May 9, 2024 · Python
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme. Now also handles automatic summarization, text classification, sentiment analysis, NER, part-of-speech tagging, and other tasks; supports the T5 model and GPT-2 for article continuation.
Updated Jun 18, 2022 · Python
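Several of the repos above, including the one just listed, build seq2seq on top of BERT using the UniLM attention-mask scheme: source tokens attend bidirectionally within the source, while target tokens attend to the full source plus only earlier target tokens. A minimal PyTorch sketch of that mask, with illustrative names not taken from any of the listed projects:

```python
import torch

def unilm_seq2seq_mask(src_len: int, tgt_len: int) -> torch.Tensor:
    """UniLM-style seq2seq attention mask (True = may attend)."""
    total = src_len + tgt_len
    mask = torch.zeros(total, total, dtype=torch.bool)
    # Every position may attend to the full source segment (bidirectional prefix).
    mask[:, :src_len] = True
    # Target positions additionally attend causally within the target segment.
    mask[src_len:, src_len:] = torch.tril(
        torch.ones(tgt_len, tgt_len, dtype=torch.bool)
    )
    return mask

# Example: 4 source tokens, 3 target tokens.
print(unilm_seq2seq_mask(4, 3).int())
```

The same mask shape is what lets a single BERT encoder act as both encoder and decoder during fine-tuning and generation.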
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Updated Aug 4, 2024 · Python
UniLM for a Chinese chitchat robot: a compliment-style ("kuakua") chatbot project based on the UniLM model.
Updated Jan 21, 2021 · Python
An unofficial baseline for the Tianchi Alibaba Lingjie Wentian Engine e-commerce search algorithm competition, a.k.a. "NLP from beginner to rank 22/2771".
Updated Jun 29, 2022 · Python
Code for the paper "UniKeyphrase: A Unified Extraction and Generation Framework for Keyphrase Prediction"
Updated Oct 3, 2024 · Python
Updated Jan 10, 2024 · Python
Text generation with a pre-trained BERT model, usable for dialogue, summarization, question generation, and other tasks. Currently supported strategies include vocabulary insertion and deletion, custom character embeddings, random word replacement, and more.
Updated Jun 1, 2022 · Python
Updated Dec 4, 2019 · Python
Using UniLM-Chinese to generate Chinese couplets.
Updated Feb 12, 2022 · Python
Updated Jun 14, 2020 · Jupyter Notebook
A project using BERT-UNILM to generate Chinese poems.
Updated May 28, 2025 · Python