Revision as of 07:02, 2 November 2017
Algoliterary Encounters
Algoliterary works
A selection of works made by the members of Algolit over the past years.
- Oulipo recipes
- i-could-have-written-that
- The Weekly Address, A model for a politician
- In the company of CluebotNG
Algoliterary explorations
This chapter presents part of the research of Algolit over the past two years.
What the Machine Writes: a closer look at the output
Two neural networks are presented in closer detail: what content do they produce?
- CHARNN text generator
- You shall know a word by the company it keeps
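The idea behind a character-level text generator such as CHARNN can be sketched without a neural network at all: a model records which character tends to follow which context, then writes new text one character at a time. A minimal sketch, using a simple character-level Markov chain as a stand-in for the recurrent network (the training string and all names below are invented for illustration):

```python
import random

text = "the machine reads and the machine writes and the machine dreams"
order = 4  # length of the character context

# Build a table: context of `order` characters -> possible next characters.
table = {}
for i in range(len(text) - order):
    context = text[i:i + order]
    table.setdefault(context, []).append(text[i + order])

def generate(seed, length):
    random.seed(0)  # fixed seed so the sketch is reproducible
    output = seed
    for _ in range(length):
        choices = table.get(output[-order:])
        if not choices:
            break
        output += random.choice(choices)
    return output

print(generate("the ", 40))
```

A recurrent network generalises this table: instead of memorising exact contexts, it learns a continuous representation of them, which is why it can produce sequences it never saw during training.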
How the Machine Reads: Dissecting Neural Networks
Datasets
Common public datasets
Algoliterary datasets
From words to numbers
Because machine learning is based on statistics and mathematics, words need to be transformed into numbers before text can be processed. The following section presents three techniques for doing so.
- A Bag of Words
- A One Hot Vector
- Exploring Multidimensional Landscapes: Word Embeddings
- Word Embeddings Case Study: Crowd embeddings
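The first two techniques listed above can be shown in a few lines. A minimal sketch in plain Python (the toy sentence is invented, not taken from the Algolit datasets):

```python
sentence = "the cat sat on the mat"
tokens = sentence.split()

# Vocabulary: every distinct word, in order of first appearance.
vocabulary = sorted(set(tokens), key=tokens.index)

# Bag of words: one count per vocabulary word; word order is discarded.
bag_of_words = [tokens.count(word) for word in vocabulary]

# One-hot vector: a vector of zeros with a single 1 at the position
# of that word in the vocabulary.
def one_hot(word):
    return [1 if word == entry else 0 for entry in vocabulary]

print(vocabulary)      # ['the', 'cat', 'sat', 'on', 'mat']
print(bag_of_words)    # [2, 1, 1, 1, 1]
print(one_hot("cat"))  # [0, 1, 0, 0, 0]
```

Word embeddings, the third technique, replace these sparse vectors with short dense ones learned from context, so that related words end up close together.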
Different visualisations of word embeddings
Inspecting the technique behind word embeddings
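Behind every word-embedding visualisation sits the same arithmetic: words are points in a multidimensional space, and their closeness is usually measured with cosine similarity. A minimal sketch, with invented three-dimensional vectors standing in for real embeddings (which have hundreds of dimensions):

```python
import math

# Toy vectors, invented for illustration only.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In this toy space, "king" lies closer to "queen" than to "apple".
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```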
How a Machine Might Speak
If a neural network could speak, what would it say?