One of the more interesting applications of the neural network revolution is text generation. Most popular approaches are based off of Andrej Karpathy's char-rnn architecture/blog post, which teaches a recurrent neural network to predict the next character in a sequence based on the previous n characters. As a result, a sufficiently trained network can theoretically reproduce its input source material, but since properly-trained neural networks aren't perfect, the output can fall into a weird-but-good uncanny valley.

Many internet tutorials for text-generation neural networks simply copy an existing char-rnn implementation while changing the input dataset. It's one approach, but there's an opportunity for improvement with modern deep learning tooling. Thanks to frameworks like TensorFlow and Keras, I built textgenrnn, a Python package which abstracts the process of creating and training such char-rnns to a few lines of code, with numerous model architecture and training improvements such as character embeddings, attention-weighted averaging, and a decaying learning rate.

A neat benefit of textgenrnn is that it can easily be used to train neural networks on a GPU very quickly, for free, using Google Colaboratory.

# Your First Text-Generating Neural Network

Colaboratory is a notebook environment similar to the Jupyter Notebooks used in other data science projects. However, Colaboratory notebooks are hosted in a short-term virtual machine with 2 vCPUs, 13GB of memory, and a K80 GPU attached. I've created a notebook which lets you train your own network and generate text whenever you want with just a few clicks!
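The next-character prediction idea at the heart of char-rnn can be sketched without any deep learning framework at all: map each n-character context to the characters that follow it, then predict the most frequent one. This is a toy frequency-table stand-in for the learned model, not textgenrnn's actual API; the function names below are illustrative only.

```python
from collections import Counter, defaultdict

def build_model(text, n=3):
    """Map each n-character context to a count of the characters that follow it."""
    model = defaultdict(Counter)
    for i in range(len(text) - n):
        context = text[i:i + n]
        model[context][text[i + n]] += 1
    return model

def predict_next(model, context):
    """Return the most likely next character for a context (None if unseen)."""
    counts = model.get(context)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

model = build_model("hello hello hello world", n=3)
print(predict_next(model, "hel"))  # 'l' follows 'hel' every time in the training text
```

A real char-rnn replaces the frequency table with a recurrent network that outputs a probability distribution over the next character, which is what lets it generalize to contexts it has never seen verbatim.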