[1308.0850] Generating Sequences With Recurrent Neural Networks

This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.
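The core idea in the abstract — generating a sequence by predicting and sampling one data point at a time, then feeding the sample back in as the next input — can be sketched as follows. This is a minimal illustration, not the paper's actual model: a toy random-weight predictor stands in for the trained LSTM, and the three-symbol alphabet is an assumption for brevity.

```python
import numpy as np

# Toy autoregressive generation loop (the abstract's "predicting one
# data point at a time"). A real system would use a trained LSTM;
# here an untrained linear "model" over the previous symbol stands in.

rng = np.random.default_rng(0)
VOCAB = list("ab ")          # toy discrete alphabet (assumption)
V = len(VOCAB)
W = rng.normal(size=(V, V))  # toy weights: logits given previous symbol

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def generate(seed_idx, steps):
    """Sample a symbol, append it, and use it as the next input."""
    out = [seed_idx]
    for _ in range(steps):
        p = softmax(W[out[-1]])   # predictive distribution for next symbol
        nxt = rng.choice(V, p=p)  # sample exactly one data point
        out.append(int(nxt))
    return "".join(VOCAB[i] for i in out)

text = generate(0, 10)
print(text)  # seed symbol plus 10 sampled symbols
```

The same loop applies to the paper's real-valued handwriting case, with the softmax replaced by a density over pen offsets (a mixture model in the paper) and the sampled point fed back in the same way.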

1 mentions: @Ronmemo1
Date: 2019/10/23 11:18

Referring Tweets

@Ronmemo1 ABST100 memo: 78 RMSprop Graves (2013) → t.co/JzIlWpoz3q — the optimizer used for handwriting synthesis with LSTMs
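The tweet refers to the RMSProp variant described in Graves (2013), which normalizes each gradient by a running estimate of its standard deviation (subtracting the squared running mean of the gradient) and adds momentum. A sketch under that assumption, with hyperparameter values chosen for illustration only:

```python
import numpy as np

def rmsprop_graves_step(theta, grad, state, lr=0.1, rho=0.95,
                        momentum=0.9, eps=1e-4):
    """One update of the Graves (2013) RMSProp variant (sketch).

    state = (n, g, delta): running mean of grad^2, running mean of grad,
    and the previous parameter update (for momentum).
    """
    n, g, delta = state
    n = rho * n + (1 - rho) * grad**2   # running average of squared gradient
    g = rho * g + (1 - rho) * grad      # running average of gradient
    # normalize by an estimate of the gradient's std: sqrt(n - g^2 + eps)
    delta = momentum * delta - lr * grad / np.sqrt(n - g**2 + eps)
    return theta + delta, (n, g, delta)

# usage: minimize f(x) = x^2 from x = 5
theta = np.array([5.0])
state = (np.zeros(1), np.zeros(1), np.zeros(1))
for _ in range(200):
    grad = 2.0 * theta
    theta, state = rmsprop_graves_step(theta, grad, state)
print(float(abs(theta[0])))  # should be well inside the starting radius
```

Because the gradient is divided by its running scale, the effective step size is roughly `lr` per iteration regardless of the raw gradient magnitude, which is what made this optimizer robust for the long-sequence LSTM training in the paper.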
