Generating Sequences With Recurrent Neural Networks

Alex Graves
2013

Abstract

This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.
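
Concretely, the one-data-point-at-a-time generation described above is an autoregressive sampling loop: the network emits a distribution over the next symbol, a symbol is drawn from that distribution, and the drawn symbol is fed back in as the next input. Below is a minimal character-level sketch in PyTorch. The architecture, sizes, and temperature are illustrative assumptions, not the paper's exact setup (the paper stacks several LSTM layers with skip connections and, for handwriting, predicts mixture-density parameters rather than a softmax), and the untrained model here produces random output.

    import torch
    import torch.nn as nn

    class CharLSTM(nn.Module):
        """Maps a prefix of symbols to a distribution over the next symbol."""
        def __init__(self, vocab_size, hidden_size=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden_size)
            self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, vocab_size)

        def forward(self, tokens, state=None):
            h, state = self.lstm(self.embed(tokens), state)
            return self.head(h), state

    @torch.no_grad()
    def sample(model, start_token, length=100, temperature=1.0):
        """Generate one symbol at a time, feeding each sample back as input."""
        token = torch.tensor([[start_token]])  # shape (batch=1, seq=1)
        state, out = None, []
        for _ in range(length):
            logits, state = model(token, state)  # carry the LSTM state forward
            probs = torch.softmax(logits[:, -1] / temperature, dim=-1)
            token = torch.multinomial(probs, num_samples=1)  # draw next symbol
            out.append(token.item())
        return out

    model = CharLSTM(vocab_size=128)  # 128 = ASCII, an arbitrary choice here
    print(sample(model, start_token=ord('T'), length=20))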


Code References

pytorch/pytorch, in torch/optim/rmsprop.py (docstring of the RMSprop optimizer):

    The centered version first appears in `Generating Sequences With Recurrent Neural Networks <https://arxiv.org/pdf/1308.0850v5.pdf>`_.
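
That citation marks where PyTorch's RMSprop optimizer exposes the centered variant attributed to this paper, in which the running average of squared gradients is centered by subtracting the square of a running mean of the gradient, giving a variance estimate for normalization. A minimal usage sketch follows; the learning rate and smoothing constant are illustrative choices, not values from the paper.

    import torch

    # Toy parameter; in practice this would be model.parameters().
    w = torch.randn(10, requires_grad=True)

    # centered=True selects the centered variant: the squared-gradient
    # average is centered by the squared mean gradient before the update.
    opt = torch.optim.RMSprop([w], lr=1e-4, alpha=0.95, centered=True)

    loss = (w ** 2).sum()  # dummy objective for the sketch
    loss.backward()
    opt.step()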