Automatic Writing With Deep Learning: Progress [Video]

We're starting to see more experiments with machine learning-driven news writing. This developer shares his model.


This is a continuation of my first post at https://dmitrykan.blogspot.com/2018/05/automatic-writing-with-deep-learning.html.

Fast forward a few months (apologies for the delay), and I can share some findings.

Again, I think we should take AI co-writer exercises with a grain of salt. However, during this time, I have come across a few practical usage examples for such systems.

One potential usage area is the augmentation of a news article writer. More specifically, when writing a news item, one of the most challenging tasks is to coin a catchy title. Does the title have some trendy phrases in it? Or perhaps it mentions an emerging topic that captures attention at this given moment? Or reuses a pattern that worked well for this given author? Or just spurs an idea in the author's head?

In the following exercise, I have set a very modest goal: train a co-writer on previously written texts with an attempt to suggest something useful from them. I would imagine that this could be extended to texts that are trending, or a collection of particularly interesting titles — what have you.

To train such a model, I have used Robin Sloan's RNN writer: https://github.com/robinsloan/rnn-writer. The goodies of the project are:

  • It is trained with Torch. These days, Torch lives on mostly through PyTorch, a deep learning Python library that is approaching production readiness.
  • The trained model is exposed through a plugin for the Atom editor. (I'd imagine that real writers would want the model integrated into their favorite text editor, such as Word.)
  • An API is available as well, for integrating the model into custom apps. (This is exactly how the Atom plugin works.)
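To illustrate the API route, here is a minimal sketch of calling a local rnn-writer-style completion server over HTTP. The endpoint path (`/generate`), port, and parameter names (`start_text`, `n`) are assumptions for illustration, not taken from the rnn-writer documentation; check the README of the server you actually run.

```python
import json
import urllib.parse
import urllib.request

def build_generate_url(base_url, start_text, n=3):
    """Build the URL for requesting n suggested continuations.

    NOTE: the /generate path and the start_text/n parameter names are
    hypothetical -- adapt them to your server's actual API.
    """
    params = urllib.parse.urlencode({"start_text": start_text, "n": n})
    return f"{base_url}/generate?{params}"

def fetch_suggestions(base_url, start_text, n=3):
    """Call the locally running model server and parse its JSON reply."""
    url = build_generate_url(base_url, start_text, n)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Constructing the request URL works without a server running:
url = build_generate_url("http://localhost:8080", "Peer-to-peer networks", n=3)
print(url)
```

A custom app would call `fetch_suggestions` as the user types and surface the returned continuations, which is essentially what the Atom plugin does on your behalf.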

I will skip the installation of Torch and the training of the network and proceed to examples; the rnn-writer GitHub repository has a good set of instructions for both. I installed Torch and trained the model on a Mac.

First things first: an RNN trained on my master's thesis, "Design and Implementation of Peer-to-Peer Network" (University of Kuopio, 2007).

The master's thesis is about 50 pages of English text with diagrams and formulas. On the one hand, more data lets a neural network learn more word representations and gives it a larger probability space for predicting the next word conditioned on the current word or phrase. On the other hand, limiting the input corpus to phrases with a specific domain goal, like writing an email, could yield a clean set of the phrases that a user employs in many typical email passages.
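The idea of predicting the next word conditioned on the current one can be sketched with a simple bigram count model; it is a crude stand-in for what the RNN learns with far richer context, but it makes the "probability space" concrete. The toy corpus below is invented for illustration.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, how often each word follows it."""
    words = text.lower().split()
    counts = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        counts[current][following] += 1
    return counts

def next_word_probs(counts, word):
    """Estimate P(next word | current word) from the bigram counts."""
    followers = counts[word.lower()]
    total = sum(followers.values())
    return {w: n / total for w, n in followers.items()}

# Tiny made-up corpus in the spirit of a P2P thesis:
corpus = ("peer to peer networks route queries between peers "
          "and peer to peer overlays index content")
model = train_bigrams(corpus)
print(next_word_probs(model, "peer"))  # {'to': 0.5, 'networks': 0.25, 'overlays': 0.25}
```

With a larger, domain-focused corpus the distributions sharpen around the author's own recurring phrases, which is exactly the co-writer effect described above.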

Since I got access to a set of Fox articles, I thought this could warrant another RNN model and a test. Something to share next time.


Published at DZone with permission of Dmitry Kan, DZone MVB. See the original article here.

