AdapterHub - 214 adapters for 30 text tasks and 32 languages

Load an Adapter for Inference 🏄

Loading existing adapters from our repository is as simple as adding one additional line of code:

```python
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
model.load_adapter("sentiment/sst-2@ukp")
```

The SST adapter is lightweight: it is only 3 MB! At the same time, it achieves results that are on par with fully fine-tuned BERT. We can now leverage the SST adapter to predict the sentiment of sentences:

```python
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
```
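Putting the pieces above together, the full inference flow could be sketched as below. This is a minimal sketch, not the official quickstart: it assumes the `adapter-transformers` package (which patches the `transformers` classes with `load_adapter` / `set_active_adapters`) is installed, and it assumes the SST-2 adapter exposes the usual two-class negative/positive head. The `label_from_logits` and `predict_sentiment` helpers are hypothetical names introduced here for illustration.

```python
def label_from_logits(logits, labels=("negative", "positive")):
    """Pure-Python argmax over one row of logits, mapped to a label.

    Works on any plain list of per-class scores; the two-class label
    order is an assumption about the SST-2 head.
    """
    return labels[max(range(len(logits)), key=lambda i: logits[i])]

def predict_sentiment(sentence):
    """End-to-end sketch: tokenize, run BERT + SST adapter, read off the label.

    Requires the adapter-transformers package and a model download,
    so it is defined here but not executed.
    """
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
    adapter_name = model.load_adapter("sentiment/sst-2@ukp")
    # Loading an adapter does not activate it; route the forward pass through it.
    model.set_active_adapters(adapter_name)

    inputs = tokenizer(sentence, return_tensors="pt")
    logits = model(**inputs).logits[0].tolist()
    return label_from_logits(logits)

# The label helper itself needs no model and can be exercised directly:
print(label_from_logits([-1.2, 2.3]))  # → positive
```

Note the `set_active_adapters` call: in `adapter-transformers`, loading an adapter only adds its weights to the model, so the adapter must be explicitly activated before it affects predictions.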

Date: 2020/12/20 05:21
