[1701.06538] Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer

The capacity of a neural network to absorb information is limited by its number of parameters. Conditional computation, where parts of the network are active on a per-example basis, has been proposed in theory as a way of dramatically increasing model capacity without a proportional increase in computation. In practice, however, there are significant algorithmic and performance challenges. In this work, we address these challenges and finally realize the promise of conditional computation, achieving greater than 1000x improvements in model capacity with only minor losses in computational efficiency on modern GPU clusters. We introduce a Sparsely-Gated Mixture-of-Experts layer (MoE), consisting of up to thousands of feed-forward sub-networks. A trainable gating network determines a sparse combination of these experts to use for each example. We apply the MoE to the tasks of language modeling and machine translation, where model capacity is critical for absorbing the vast quantities of knowledge available in the training corpora.
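The core mechanism the abstract describes is sparse gating: the gating network scores all experts but keeps only the top-k per example, so only those experts are ever evaluated. Below is a minimal NumPy sketch of that idea under stated assumptions; it uses plain top-k gating and omits the paper's noise term and load-balancing loss, and the names `moe_forward`, `W_gate`, and the toy linear experts are illustrative, not the paper's TensorFlow implementation.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x, W_gate, experts, k=2):
    """Sparsely-gated MoE for a single example x of shape [d].

    experts: list of callables, each mapping [d] -> [d_out].
    Only the k experts with the largest gating logits are evaluated,
    so compute scales with k rather than with len(experts).
    """
    logits = x @ W_gate                      # [num_experts] gating scores
    top_k = np.argsort(logits)[-k:]          # indices of the k largest logits
    gates = softmax(logits[top_k])           # renormalize over selected experts
    # Weighted combination of only the chosen experts' outputs
    return sum(g * experts[i](x) for g, i in zip(gates, top_k))

# Toy usage: 4 experts, each a small linear map; only 2 run per example.
rng = np.random.default_rng(0)
d, d_out, num_experts = 8, 8, 4
experts = [(lambda W: (lambda x: x @ W))(rng.normal(size=(d, d_out)))
           for _ in range(num_experts)]
W_gate = rng.normal(size=(d, num_experts))
y = moe_forward(rng.normal(size=d), W_gate, experts, k=2)
```

Because the k selected experts get renormalized softmax weights while the rest contribute exactly zero, total parameter count can grow with the number of experts while per-example compute stays roughly constant, which is the capacity/computation trade the abstract claims.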

2 mentions: @JeffDean
Date: 2020/07/01 03:51

Referring Tweets

@JeffDean Worth pointing out that, like original sparsely-gated mixture of experts work (t.co/xqFLBMRWo4), this work also sees that training time is reduced by ~10X over a large dense model while simultaneously having much higher quality (36.9 BLEU for dense vs. 44.3 for sparse).


Related Entries

Read more Magenta
31 users, 7 mentions 2019/06/12 23:16
Read more [1803.01271] An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Mo...
13 users, 2 mentions 2020/02/27 23:20
Read more GitHub - microsoft/VoTT: Visual Object Tagging Tool: An electron app for building end to end Object ...
14 users, 0 mentions 2020/03/15 14:21
Read more [1609.04468] Sampling Generative Networks
2 users, 0 mentions 2020/04/28 15:50
Read more Unsupervised Learning of Depth and Ego-Motion from Video
2 users, 1 mentions 2020/05/04 00:52