Serving Google BERT in Production using Tensorflow and ZeroMQ · Han Xiao Tech Blog - Deep Learning, NLP, AI

When we look back at 2018, one of the biggest news stories in the world of ML and NLP is Google's Bidirectional Encoder Representations from Transformers, aka ... · Han Xiao

2 mentions: @helmetti
Date: 2019/02/23 14:17

Referring Tweets

@helmetti A strategy for serving BERT, the NLP field's first breakthrough in two years. The idea is that if vectorization is fast enough, you can obtain latent representations in real time. I hope that idea turns out to be only a stopgap, and that with super-fast hardware, by around the second half of this year anyone will be able to do this even with a rough setup. t.co/n7hibic1SU
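
The serving strategy in the linked post puts a BERT encoder behind a ZeroMQ-based service so that clients can request sentence vectors in real time. As an illustration only, here is a minimal sketch assuming the post's companion bert-serving-server / bert-serving-client packages are installed and a pre-trained BERT checkpoint has been downloaded; the model directory and worker count are placeholders.

    # Minimal sketch (assumption: the bert-serving-server and bert-serving-client
    # packages are installed, and an uncased English BERT checkpoint has been
    # unpacked to /tmp/uncased_L-12_H-768_A-12).
    #
    # Start the server in a separate shell:
    #   bert-serving-start -model_dir /tmp/uncased_L-12_H-768_A-12 -num_worker=2

    from bert_serving.client import BertClient

    # The client talks to the server over ZeroMQ; host and ports below are the defaults.
    bc = BertClient(ip='localhost')

    # encode() returns a NumPy array of shape (num_sentences, 768) for BERT-Base.
    vectors = bc.encode(['serving BERT in production',
                         'real-time sentence embeddings'])
    print(vectors.shape)  # (2, 768)

Because the heavy TensorFlow graph stays on the server side, the client only pays the cost of a ZeroMQ round trip per request, which is what makes the "real-time latent representation" idea in the tweet practical.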

Related Entries

GitHub - soskek/bert-chainer: Chainer implementation of "BERT: Pre-training of Deep Bidirectional Tr...
7 users, 0 mentions 2018/12/02 18:01
[DL Hacks]BERT: Pre-training of Deep Bidirectional Transformers for L…
4 users, 5 mentions 2018/12/07 04:31
Deep Learning for NLP Best Practices
97 users, 0 mentions 2018/04/22 03:40
The major advancements in Deep Learning in 2016 | Tryolabs Blog
0 users, 0 mentions 2018/04/22 03:40
A survey of papers on unsupervised image inspection using deep learning (GAN/SVM/Autoencoder, etc.) .pdf
0 users, 0 mentions 2018/10/09 06:53