Summary of the tasks — transformers 4.1.1 documentation

This page shows the most common use cases for the library. The available models allow for many different configurations and great versatility across use cases. The simplest ones are presented here, showcasing usage for tasks such as question answering, sequence classification, named entity recognition, and others. These examples leverage auto-models, which are classes that instantiate a model according to a given checkpoint, automatically selecting the correct model architecture.
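The auto-model loading described above can be sketched roughly as follows; the checkpoint name here is purely illustrative, and any compatible checkpoint from the Hub would work the same way:

```python
def load_sentiment_model(checkpoint="distilbert-base-uncased-finetuned-sst-2-english"):
    # Imported lazily so the sketch reads even without transformers installed.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # The tokenizer matching the checkpoint is resolved automatically.
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    # The auto-class inspects the checkpoint's config and instantiates the
    # matching architecture (for this checkpoint, a DistilBERT classifier),
    # so no architecture-specific class needs to be named in user code.
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
    return tokenizer, model
```

The point of the auto-classes is that swapping the checkpoint string is enough to swap architectures; the rest of the code stays unchanged.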

1 mention: @kazmuzik
Keywords: transformer
Date: 2021/01/02 20:21

Referring Tweets

@kazmuzik The two examples in the Text Generation section of "Summary of tasks" ran even with gpt2-xl on an 11 GB RTX 2080 Ti. On a CPU (EPYC 7571) they took 23 seconds, versus 2.75 seconds on the RTX 2080 Ti, about an order of magnitude faster.
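The text-generation examples the tweet benchmarks use the library's `pipeline` API. A minimal sketch, using the small `gpt2` checkpoint instead of `gpt2-xl` and a hypothetical prompt:

```python
def generate(prompt="Today the weather is", model_name="gpt2"):
    # Imported lazily so the sketch reads even without transformers installed.
    from transformers import pipeline, set_seed

    set_seed(42)  # make the sampled continuation reproducible
    # Runs on CPU by default; pass device=0 to run on a GPU such as the
    # RTX 2080 Ti mentioned in the tweet.
    generator = pipeline("text-generation", model=model_name)
    # max_length counts prompt tokens plus generated tokens.
    outputs = generator(prompt, max_length=30, num_return_sequences=1)
    return outputs[0]["generated_text"]

if __name__ == "__main__":
    print(generate())
```

Timings like those in the tweet depend heavily on the checkpoint size: `gpt2-xl` is roughly 1.5B parameters, which is why the GPU speedup is so pronounced.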
