What a nonsense subject, sorry. I generated it with a deep neural network built around a model called an LSTM (long short-term memory), which generates text one character at a time. Generated text from these networks is still nonsense, but when done well they capture much more of the structure of our language than earlier ideas like Markov chain generation, which usually produces pure word soup.
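For contrast, here's roughly what the Markov chain approach looks like; this is a minimal word-level sketch, not the code I actually ran. It only remembers which word followed which, so its output wanders with no long-range structure:

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each run of `order` words to the list of words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, seed, n_words=20, rng=None):
    """Random-walk the chain starting from `seed`, one word at a time."""
    rng = rng or random.Random()
    key = tuple(seed.split())
    out = list(key)
    for _ in range(n_words):
        followers = chain.get(key)
        if not followers:
            break  # dead end: nothing ever followed this context
        nxt = rng.choice(followers)
        out.append(nxt)
        key = key[1:] + (nxt,)
    return " ".join(out)
```

Because the chain forgets everything beyond its short context window, it happily jumps between unrelated sentences, which is where the word soup comes from; the LSTM's hidden state is what lets it carry structure further.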
I trained it on all the previous Listserve emails that have been sent out and ran it overnight. I'm sure running it longer would have produced a full email, but we ran out of time.
Here are some of the intermediate steps, which are interesting (note that I seeded the network with the word "hello"):
Initially it just generated spaces:
> "hello "
Next it figured out a rather common English word:
> "hellon the the the the the the the the the the the"
After a while it got a bit existential:
> "hello to the past and i was all around me to make the world and i was so much to do this. i was all around me to make the world and i was so much to do this."
Then it learned about Daft Punk (awesome):
> "hello to anyone else in the world around the world around the world around the world around the world"
After a long night of training on a GPU we ended up with the subject of this email; I hope you enjoy it. I feel like the way it keeps getting stuck in loops lends some credibility to the glitchy sci-fi androids we've seen in movies and media up until now.
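The generation loop that produces samples like the ones above can be sketched like this. This is a hedged illustration, not my actual code: `next_char_probs` is a hypothetical stand-in for the trained LSTM (which isn't shown), and the temperature knob is a common trick for controlling how repetitive the output gets:

```python
import math
import random

def sample_char(probs, temperature=1.0, rng=None):
    """Sample one character from a {char: probability} distribution.

    Lower temperature sharpens the distribution (more repetition and
    loops); higher temperature flattens it (more surprising output).
    """
    rng = rng or random.Random()
    chars = list(probs)
    # Re-weight in log space by temperature, then renormalize (softmax).
    logits = [math.log(max(probs[c], 1e-12)) / temperature for c in chars]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    weights = [e / total for e in exps]
    return rng.choices(chars, weights=weights)[0]

def generate(next_char_probs, seed="hello", length=40, temperature=1.0, rng=None):
    """Generate text one character at a time, feeding each sample back in."""
    rng = rng or random.Random()
    text = seed
    for _ in range(length):
        probs = next_char_probs(text)  # the trained network would go here
        text += sample_char(probs, temperature, rng)
    return text
```

The loops in the samples above are what you'd expect at low temperature: once the model lands in a high-probability rut like "around the world", greedy-ish sampling keeps picking the same continuation.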
About me: I work at a healthcare analytics company. I study math and computer science, and still take classes at the local university. I'll probably share the code I used to generate this on my GitHub, @log0ymxm (you can also find me on Twitter if you want to talk about math, computers, or healthcare).