[2010.07515] RNNs can generate bounded hierarchical languages with optimal memory

Recurrent neural networks empirically generate natural language with high syntactic fidelity. However, their success is not well-understood theoretically. We provide theoretical insight into this success, proving in a finite-precision setting that RNNs can efficiently generate bounded hierarchical languages that reflect the scaffolding of natural language syntax. We introduce Dyck-($k$,$m$), the language of well-nested brackets (of $k$ types) and $m$-bounded nesting depth, reflecting the bounded memory needs and long-distance dependencies of natural language syntax. The best known results use $O(k^{\frac{m}{2}})$ memory (hidden units) to generate these languages. We prove that an RNN with $O(m \log k)$ hidden units suffices, an exponential reduction in memory, by an explicit construction. Finally, we show that no algorithm, even with unbounded computation, can suffice with $o(m \log k)$ hidden units.
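To make the Dyck-($k$,$m$) definition concrete, here is a minimal membership checker written as a plain bounded-stack automaton. This is not the paper's RNN construction; the function name, token encoding, and helper are illustrative assumptions. The stack holds at most $m$ bracket types, each representable in $\lceil \log_2 k \rceil$ bits, which mirrors the $O(m \log k)$ memory bound the paper proves achievable.

```python
from math import ceil, log2

def is_dyck_km(tokens, k, m):
    """Check membership in Dyck-(k,m): well-nested brackets of k types
    with nesting depth at most m. A token is (is_open, bracket_type)."""
    stack = []  # at most m entries, each a bracket type in [0, k)
    for is_open, t in tokens:
        if not 0 <= t < k:
            return False  # unknown bracket type
        if is_open:
            if len(stack) == m:
                return False  # depth bound m exceeded
            stack.append(t)
        else:
            if not stack or stack.pop() != t:
                return False  # mismatched or unbalanced close
    return not stack  # accept only if fully balanced

def memory_bits(k, m):
    """Bits the checker's stack can ever need: m slots of ceil(log2 k) bits."""
    return m * ceil(log2(k))
```

For example, with $k=2$ types written as 0 and 1, the word "( [ ] )" is the token list `[(True, 0), (True, 1), (False, 1), (False, 0)]`; it is accepted for $m=2$ but rejected for $m=1$, since its nesting depth is 2.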

2 mentions: @SuryaGanguli, @su_9qu
Keywords: rnn
Date: 2020/10/17 23:21

Referring Tweets

@SuryaGanguli Our #emnlp2020 paper on how recurrent networks generate bounded hierarchical structure. t.co/i52PiqQvMn Congrats to @johnhewtt for being a main driving force behind this work! t.co/r18Va1ojdK
@su_9qu On Dyck-(k,m), which came out of Manning's group. This will run a bit long, but I'll excerpt from the paper below. t.co/521R26qvjd

Related Entries

Read more [1912.12834] Randomly Projected Additive Gaussian Processes for Regression
0 users, 3 mentions 2020/01/03 02:21
Read more [2005.14050] Language (Technology) is Power: A Critical Survey of "Bias" in NLP
0 users, 7 mentions 2020/05/29 15:51
Read more [2005.00770] Exploring and Predicting Transferability across NLP Tasks
0 users, 3 mentions 2020/06/28 06:52
Read more [2009.10795] Dataset Cartography: Mapping and Diagnosing Datasets with Training Dynamics
0 users, 6 mentions 2020/09/27 14:21
Read more [2010.02194] Self-training Improves Pre-training for Natural Language Understanding
0 users, 2 mentions 2020/10/15 15:52