
Algebra bridges syntax and meaning in natural language

Posted July 29, 2014
Frobenius algebras are mathematical structures named after the German mathematician Georg Frobenius (1849–1917) and used in many branches of physics and mathematics. Image credit: Oberwolfach Photo Collection, from a photo album of the Mathematische Gesellschaft (Hamburg), via Wikimedia Commons.


Natural language has been an object of interest in numerous disciplines, ranging from philosophy to biology and – obviously – linguistics. The unanswered questions are abundant: what are the origins of language? Is language innate? How does something we call meaning attach to chunks of sounds or letters? And how does the way these chunks relate affect the message conveyed?

The latter question has been an object of intense study in various subfields of linguistics. It is known as the principle of compositionality: the idea that the meaning of a sentence is determined by the meanings of the words it is composed of, together with the way those words are combined. Its modern formulation is due to Gottlob Frege, one of the founders of modern formal logic.

Recently, the problem of how meanings compose has been studied increasingly by researchers in computational linguistics, and in categorial grammar in particular. The field is particularly well suited to this kind of compositional analysis because of the tools it has at its disposal for modelling language in a highly formalized, mathematical manner. Categorial grammar is a family of formalisms related to an abstract branch of mathematics called category theory, a handy tool for those who wish to relate different structures in a non-arbitrary manner.

An implicit question in research on compositionality is how the structure of language relates to its meaning – more precisely, how is grammar related to semantics? It is unsurprising that the problem persists: grammar and semantics are often treated as separate subfields of linguistics, and the formal tools of the two have seemed too distant to reconcile.

A team of researchers – Dimitri Kartsaklis, Mehrnoosh Sadrzadeh, Stephen Pulman, and Bob Coecke of Oxford University – noted that the power of category-theoretic unification of different mathematical structures, which has already yielded positive results in physics, could be of some use in linking grammar with semantics as well.

In terms of grammar, there have been elegant descriptions of deep sentence structure that correspond nicely to certain structures in abstract algebra. They have been placed under the title of type-logical grammars, referring to the treatment of sentence components – nouns, verbs, adjectives, and so on – as types. Given a few basic types and several simple rules describing how they may be composed, one has an elegant computational description of natural-language structure.
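As a rough sketch of how such a type system works, the snippet below implements the cancellation rules of a pregroup grammar, one member of the type-logical family (the types, the words, and the naive reduction loop are simplified for illustration and are not the paper's exact formalism):

```python
# Minimal sketch of pregroup-style type reduction: adjacent pairs
# (x, x^r) and (x^l, x) cancel, and a word string parses as a sentence
# if its types reduce to the single sentence type "s".

def reduce_types(types):
    """Repeatedly cancel adjacent pairs until no rule applies."""
    types = list(types)
    changed = True
    while changed:
        changed = False
        for i in range(len(types) - 1):
            a, b = types[i], types[i + 1]
            if b == a + "^r" or a == b + "^l":
                del types[i:i + 2]   # the pair cancels away
                changed = True
                break
    return types

# A transitive verb expects a noun on its left and a noun on its right,
# so its type is n^r s n^l.
sentence = ["n",                # "John"    (noun)
            "n^r", "s", "n^l",  # "writes"  (transitive verb)
            "n"]                # "letters" (noun)

print(reduce_types(sentence))   # -> ['s']: a well-formed sentence
```

Note that the grammar only checks types: any other noun in place of "John" would reduce just as well, which is exactly the limitation discussed next.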

The problem is that such a description is far from definitive, mostly because it says nothing about word meaning. Every word of the same type is given the same interpretation, so as far as type-logical grammar goes there is no difference between ‘cat’ and ‘dog’, although there is a difference between ‘dog’ and ‘doggish’.

Correspondingly, there are elegant computational models for semantics, collected under the banner of distributional models of meaning, that borrow ideas from linear algebra to describe how word meanings change with context. The idea is to characterize a context by listing a set of kernel words found in some corpus, such as a text or a collection of texts.

These kernel words correspond to the idea of a span in a vector space – a set of vectors from which all the other vectors in the space can be built. The rest of the words in the corpus correspond to those other vectors, each assembled from the spanning set according to how often the word occurs near each kernel word. The distances between the resulting vectors then capture the context-sensitive meanings of the corpus words.
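To make this concrete, here is a toy illustration (the mini-corpus, the kernel words, and the co-occurrence window are invented for the example; real distributional models use large corpora and weighted counts):

```python
# Toy distributional model: each word becomes a vector of co-occurrence
# counts with a handful of kernel words, and cosine similarity between
# these vectors stands in for similarity of meaning.

import math
from collections import Counter

corpus = ("the cat chases the mouse . the dog chases the cat . "
          "john writes letters . mary writes letters").split()

kernel = ["chases", "writes", "letters", "mouse"]  # the spanning "context" words

def vector(word, window=2):
    """Count how often `word` co-occurs with each kernel word."""
    counts = Counter()
    for i, w in enumerate(corpus):
        if w == word:
            for c in corpus[max(0, i - window): i + window + 1]:
                if c in kernel and c != word:
                    counts[c] += 1
    return [counts[k] for k in kernel]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# "cat" and "dog" occur in similar contexts and end up close together;
# "cat" and "john" do not.
print(cosine(vector("cat"), vector("dog")))   # high (about 0.89 here)
print(cosine(vector("cat"), vector("john")))  # zero
```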

The problem, as mentioned above, is how to connect these two formalisms into a fuller and richer description of natural language. Kartsaklis and his colleagues used abstract algebra to give faithful mathematical characterizations of type-logical grammars on the one hand and of vector-space semantics on the other. As it turns out, the two are readily related via a class of algebras known as Frobenius algebras, which are more than a hundred years old and have played a significant role in both mathematics and physics.

The connection allows one to combine the descriptive power of formal grammar and formal semantics without losing content from either side. In this way, for example, the meaning of different words is no longer reduced to their types (such as adjective, noun or verb), as it is in type-logical grammar alone.

The overall formal account enables researchers to test models of how syntax is related to semantics. For example, one can ask: given a transitive verb phrase such as “John writes letters”, which is more important for the meaning of the transitive “writes” – the subject (“John”) or the object (“letters”)? After testing different models on the British National Corpus, which contains some six million sentences, it turned out that the models which put more weight on the object of a transitive verb are semantically more accurate than those that weight the subject more heavily.
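As a rough illustration of what two such competing models look like, the sketch below contrasts a composition that copies the subject dimension with one that copies the object dimension, in the spirit of the Frobenius “copying” operations (the vectors and the rank-1 verb matrix are invented for the example; the models compared in the paper are defined categorically and evaluated against the corpus):

```python
# Two Frobenius-style compositions for "subject verb object": concretely,
# the Frobenius "copy" map amounts to an element-wise product, applied
# either on the subject side or on the object side.

import numpy as np

subj = np.array([1.0, 0.0, 1.0])   # hypothetical vector for "John"
obj  = np.array([0.0, 1.0, 1.0])   # hypothetical vector for "letters"
verb = np.outer(subj, obj)         # a rank-1 stand-in for the "writes" matrix

# Copy-subject: duplicate the subject vector, then take the element-wise
# product of the subject with the verb applied to the object.
copy_subject = subj * (verb @ obj)

# Copy-object: symmetric, duplicating the object vector instead.
copy_object = obj * (verb.T @ subj)

print(copy_subject)   # sentence vector emphasizing the subject
print(copy_object)    # sentence vector emphasizing the object
```

Comparing sentence vectors like these against human similarity judgments is what allows one model to be declared more semantically accurate than another.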

More generally, the connection between syntax and semantics via Frobenius algebras opens the door for an inquiry as to how the meaning of a sentence is related to its structure.

Reference: Dimitri Kartsaklis, Mehrnoosh Sadrzadeh, Stephen Pulman, Bob Coecke, “Reasoning about Meaning in Natural Language with Compact Closed Categories and Frobenius Algebras”, arXiv:1401.5980 [cs.CL], 23 January 2014.
