Quick notes on computational models of meaning

Part of Human Scale Natural Language Processing.

(Notes are a work in progress; sorry, and thanks!)

Broadly, there are three ways of representing the meaning of words (“lexical semantics”) computationally:

You can represent a graph as a matrix, so ultimately all of these techniques represent the meaning of a word as a sequence of numbers (i.e., a vector). For this reason, in practice we can use many of the same computational and mathematical tricks to work with data derived from any of these sources.
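
To make this concrete, here is a minimal sketch in Python. The tiny graph of word relations below is invented purely for illustration (it isn't drawn from any real lexical resource): the graph is turned into an adjacency matrix, each row of that matrix stands in as a word vector, and a standard similarity measure (cosine similarity) then works on those rows exactly as it would on vectors from any other source.

```python
import numpy as np

# Toy "semantic network": each word is linked to words it is related to.
# (These words and edges are made up for illustration only.)
graph = {
    "cat": ["kitten", "animal", "pet"],
    "dog": ["puppy", "animal", "pet"],
    "kitten": ["cat", "animal"],
    "puppy": ["dog", "animal"],
    "animal": ["cat", "dog", "kitten", "puppy"],
    "pet": ["cat", "dog"],
}

words = sorted(graph)
index = {w: i for i, w in enumerate(words)}

# Represent the graph as an adjacency matrix: one row and one column per word,
# with a 1 wherever two words are linked.
adjacency = np.zeros((len(words), len(words)))
for word, neighbors in graph.items():
    for neighbor in neighbors:
        adjacency[index[word], index[neighbor]] = 1.0

# Each row of the matrix is now a vector standing in for that word's meaning.
def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Here, cosine similarity measures how much two words' sets of graph
# neighbors overlap -- the same measure we'd apply to vectors built from
# co-occurrence counts or any other source.
print(cosine_similarity(adjacency[index["cat"]], adjacency[index["dog"]]))    # ~0.67
print(cosine_similarity(adjacency[index["cat"]], adjacency[index["puppy"]]))  # ~0.41
```

Once the graph has been flattened into rows of numbers like this, nothing downstream needs to know it started life as a graph; the same vector arithmetic applies regardless of the source.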