My Interests
I work on computational and mathematical linguistics as well as
logic. This web page might give you a more specific idea of what
I have been doing so far. It is not meant to state exclusive areas
of interest.
Computational and Mathematical Linguistics
I consider myself a theoretical computational linguist. Just like an
ordinary theoretical computer scientist I rarely use the
computer to get work done. Rather, my interest is in understanding
how the computer can be used to achieve something. This means
that I ask the following questions:
- How can the problem be adequately formalized?
- Can the problem be solved in principle, and if so, how?
- What is the most efficient way to solve it?
Much work in computational linguistics is of a practical nature,
applying certain methods or experimenting with them. There is,
however, as always, a need to understand exactly what these methods
yield and what we use them for, in order to properly assess the
merits of the practical research that is being done. Also, a lot of theoretical
work in linguistics is not in a form that allows immediate implementation.
Thus, a rigorous formalization is needed before one can proceed to the
implementation. This is the work that I do (just like
Ed Stabler).
Some of the topics mentioned below under General Linguistics are also
treated mathematically, but I will mention here only things that
are immediately related to these questions.
Formalisation of Generative Grammar
Originally, I was interested in formal syntax, in particular
formal properties of Government and Binding Theory, though I have
concerned myself also with the foundations of GPSG and HPSG. In a
series of three papers I dealt with the main innovative tools of
transformational grammar: the use of
command relations, the use of
adjunction, and the use of
chains. Apart from theoretical interests
(for example, showing that
multidominance structures are practically equivalent to the more
common trace chain structures that you see in the textbooks), there is
another motive behind the research. Generative grammar is a complicated
theory, and it is often very difficult to gauge the consequences of
particular assumptions. Ultimately, we want to be assured that
everything works as promised. Such an assurance may come in the form
of a theorem prover where you can enter a principle or grammatical
fact and you are told whether or not it holds. Such a prover has
been written in Prolog by
Ed Stabler.
The distinctive trait of my own approach is that I use dynamic logic rather
than predicate logic or monadic second-order logic. I have recently
established that the dynamic logic of multidominance structures is
decidable (see here). This is not all that
we need, but I am slowly getting there.
The hierarchies of languages have played an essential
role in proofs of adequacy of formal models (witness the results by
Peters and Ritchie or the history of GPSG). My own interest is in
knowing where to place the limits
either way. One result was the proof that GB implies strong
context freeness if head movement is bounded (see
Syntactic Codes and Grammar Refinement).
This result was independently established by
James Rogers.
I also got interested in Linear Context Free Rewrite Systems.
It is known (thanks to
Jens Michaelis
and Henk Harkema) that
the Minimalist Program is weakly equivalent to these grammars.
Together with Jens Michaelis I had earlier established that
languages with stacked cases are not semilinear, and thus cannot
be generated by these grammars (and therefore not by the Minimalist
Program). For the paper see here.
A summary of case marking and semilinearity can be found
here. Semilinearity has been
a hobby of mine ever since. I managed to find
a new proof of a theorem by Ginsburg and Spanier (see
A New Proof of a Result by Ginsburg and Spanier).
I get a lot of requests for it but it is already included in
"The Mathematics of Language". Another nice result is the
proof that Ogden's Lemma (a strengthening of the Pumping Lemma)
is vastly insufficient to characterise context free languages;
not even a strengthening suggested by Alexis Manaster-Ramer is
enough (see here).
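For the record, here is the central definition (a schematic summary in
my own notation; the papers contain the precise statements). A set
M ⊆ ℕⁿ is linear if there are vectors v₀, v₁, …, vₘ ∈ ℕⁿ such that

  M = { v₀ + k₁v₁ + … + kₘvₘ : k₁, …, kₘ ∈ ℕ },

and semilinear if it is a finite union of linear sets. A language is
called semilinear if the set of its Parikh vectors (which count, for
each letter, its number of occurrences in a string) is semilinear.
Parikh's Theorem says that every context free language is semilinear,
and the languages of Linear Context Free Rewrite Systems are
semilinear as well; this is why a failure of semilinearity excludes
these grammars.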
Implementation
Teaching programming involves writing code myself. I am working in
OCaml. I have a tool for manipulating finite state automata,
and I expect to do more when I teach the course again.
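To give an idea of the core of such a tool, here is a minimal sketch
in OCaml (my illustration for this page, not the source code of the
actual tool; the type and all names are invented):

  (* States are integers; transitions are (source, symbol, target)
     triples, so the automaton may be nondeterministic. *)
  type automaton = {
    start  : int;
    finals : int list;
    delta  : (int * char * int) list;
  }

  (* All states reachable from state q by reading symbol c. *)
  let step a q c =
    List.filter_map
      (fun (p, x, r) -> if p = q && x = c then Some r else None)
      a.delta

  (* Acceptance: track the set of states reachable after each
     symbol of the input word. *)
  let accepts a w =
    let rec run qs i =
      if i = String.length w then
        List.exists (fun q -> List.mem q a.finals) qs
      else
        run
          (List.sort_uniq compare
             (List.concat_map (fun q -> step a q w.[i]) qs))
          (i + 1)
    in
    run [ a.start ] 0

  (* Example: the automaton accepting (ab)*. *)
  let ab_star =
    { start = 0; finals = [ 0 ];
      delta = [ (0, 'a', 1); (1, 'b', 0) ] }

  let () =
    assert (accepts ab_star "abab");
    assert (not (accepts ab_star "aba"))

A real tool would add determinisation, minimisation and product
constructions on top of such a representation.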
Also, I have implemented the ideas on referent systems
expounded below. This will eventually
allow one to test the theory and run it on large-scale dictionaries.
This work is still experimental. If you want to obtain the
source code, please email me.
Logic and Grammar
There is a research area called model theoretic syntax
into which these interests partly fit. One half of the enterprise
has been explained above, namely the analysis of theories (HPSG,
Generative Grammar) with the help of a logical language (in my
case dynamic logic). Part of it has to do with obtaining
decidability proofs. There are also deeper connections to be
uncovered. In Inessential Features
I have introduced the notion of an inessential feature. This is
a feature whose distribution is controlled by the other
features. It is, in logical terms, implicitly defined.
If one can obtain an explicit definition (not always
guaranteed!) then the feature can be eliminated from the
grammar. This was meant to explain the divergence between
HPSG, where every detail is in the representation, and
generative grammar where "inessential" stuff is omitted.
Needless to say, linguists will not necessarily see
things the way I have explained them, but if one is interested
in seeing the common features behind the grammar formalisms,
an analysis of this sort is called for.
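To make this precise (a schematic rendering of the standard logical
definitions, in my own formulation): a feature f is implicitly defined
by a grammar G if any two structures admitted by G that agree on all
features other than f also agree on f; it is explicitly defined if
there is a condition δ not mentioning f such that G entails f ↔ δ.
Beth's theorem guarantees that in first order logic implicit
definability yields explicit definability; for the description
languages used here, this transfer is exactly what is not always
available.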
General Linguistics
Agreement Morphology and Case
Recently, I have become (again) more interested in semantics. For the most
part I studied the interaction between morphology and semantics. In the
monograph Agreement Morphology, Argument Structure and Syntax, I have spelled
out in detail a theory that maps morphological strings onto semantic
structures and back. This work has turned out to be just the beginning
of an investigation that leads to a reassessment of the role of
morphological elements, most notably cases. A sideline of this
research is the particular structure of local cases. One
conclusion is that local cases consist of several layers (see
Against the Feature Bundle Theory of Case).
Ordinary case selection is selection of two heads, not one;
the case where just one head is selected has not hitherto been
identified, but does occur quite frequently (see
Directionality Selection).
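An illustration of the layering (my example, simplified): Hungarian
ház-ban 'in the house' versus ház-ba 'into the house'. The local case
decomposes into a localiser, which fixes a neighbourhood of the object
('in', 'on', 'at'), and a modaliser, which fixes the relation of the
event to that neighbourhood (rest versus movement to or from it),
rather than being a single unstructured bundle of features.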
Compositionality
Another issue that has concerned me recently is the notion of
compositionality. My main intuition about compositionality is that
present theories do not contain a workable notion of the parts of an expression.
I have published on this in
Strict Compositionality and Literal Movement Grammars.
I have made the notion of compositionality a central concern of the book
"Mathematics of Language". While in that book I studied mainstream
theories, such as Montague Grammar, in depth, I have recently moved
to investigate different approaches to the question. Suppose, for
example, that meanings are not typed λ-terms but rather truth
conditions. Suppose also that meanings contain no indication
of syntax, and that syntactic representations contain nothing of value
for semantics. Then the thesis of compositionality becomes highly
nontrivial; the conversion to Chomsky Normal Form no longer comes for free.
Now we can actually show that nontrivial syntactic facts arise out
of these assumptions (that Dutch is not strongly context free, for
example). If you are interested, a preliminary version can be found
here.
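To see what is at stake (my illustration, not taken from the paper):
take the rule S → aSb. The usual conversion to Chomsky Normal Form
replaces it by S → AT, T → SB, A → a, B → b. Both grammars generate
the same strings, but they cut them into different parts: the original
rule analyses a string into the three parts a, the S-part, and b,
whereas the converted grammar imposes a nested binary bracketing. Once
compositionality is stated in terms of parts, the two grammars are no
longer trivially interchangeable, and the preservation of meaning
under the conversion becomes something to be proved.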
The Language of Space
The language of space is a small area compared to the language of
time. Tense and aspect are popular topics in linguistics, while space
is often considered only in its function as a metaphor for something
else (see the work of Langacker). I got interested in space both
because of my interest in Uralic languages (which have lots of local
cases) and because space as a formal object is well
studied, so we can actually say with enough precision what certain
words mean. Phrases such as "into the city", "200 m from the wall"
and so on have reasonably precise meanings. I have been looking at a
wide range of languages (Indo-European, Uralic, Oceanic, Native
American), studying the morphological and syntactic patterns for
expressions of space. A reader is on the website, but it is in a
very rough state right now.
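One example of the kind of precision I have in mind (a schematic
version only; the analysis in the reader is more refined): 'into the
city' can be taken to denote those changes of location whose final
phase lies inside loc(the city), where loc assigns to each object the
region it occupies. 'Out of' and 'towards' then differ from it only in
which phase of the change is constrained, and how.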
Mathematical Logic
The Lattices of Modal Logics
I have been concerned mostly with the theory of modal propositional
logic, though with almost every issue that falls under that heading. My
initial research was centered around the lattice of modal logics,
in particular splittings of that lattice. Later I worked on
the relationship between monomodal and polymodal logic. This work
was done in close collaboration with
Frank Wolter.
The basic result is that, for any given number n, the lattice of
n-modal logics is isomorphic to some interval in the
lattice of monomodal logics, and this isomorphism is faithful to
almost any property of modal logics. This is a very effective tool
in producing counterexamples. My book "Tools and Techniques in
Modal Logic" (Elsevier, 1999) contains a pretty good picture of
my research and research interests in modal logic.
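Stated schematically (in my notation, with NExt S the lattice of
normal logics extending S): for each n there are monomodal logics
Sim₀ ⊆ Sim₁ such that

  NExt Kₙ ≅ [Sim₀, Sim₁] ⊆ NExt K₁,

and the isomorphism preserves and reflects, among other things,
decidability, completeness and the finite model property.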
Reduction Functions
Consider the problem whether a formula φ follows from a
set of premises Γ in a logic M containing L.
It is then known that there is a finite set Δ, depending on
both φ and Γ, such that φ follows from Γ
in M if and only if φ follows
from Γ;Δ in L. Suppose now that there
is an algorithm to compute Δ from φ and
Γ. Then if L is decidable, so is M.
Moreover, explicit complexity bounds can be given. If L
has the finite model property, then M does, too. Finally,
there is a criterion (easy to verify) which allows one to deduce that
M has interpolation if L does. I have introduced
the method in "Tools and Techniques" and later refined it. It has
great potential (the best known space bounds can be derived with
its help). Moreover, it is finitistic and so it does away with all
the infinite models that make introductory textbooks so heavy.
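In schematic form (my summary of the method): a reduction function is
a computable map X taking φ and Γ to a finite set X(φ, Γ) of formulas
such that

  Γ ⊢ φ in M  iff  Γ; X(φ, Γ) ⊢ φ in L.

Every derivability question about M thereby becomes a derivability
question about L, which is what drives the transfer of decidability,
complexity bounds and the finite model property.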
Modal Predicate Logic
Modal predicate logic is of interest to linguists and philosophers
and they have contributed the most to the subject, both in the form
of puzzles (the Pierre Puzzle) and in the form of proposals for
semantics. The most popular brands are Kripke-semantics and
counterpart semantics. Both are known to be highly incomplete.
That means in practice that even for systems whose propositional
part is Kripke-complete one may not be able to provide semantics
of the mentioned kind. Valentin Shehtman and Dimiter Skvortsov
have given the first generally complete semantics. Their semantics
is based on hyperdoctrines and is quite unintuitive. Together with
Oliver Kutz I have worked on
general semantics, first for counterpart logics and then for modal
predicate logics (first and second order). We have proved a general
completeness theorem and shown how to systematically interpret this
semantics in terms of the other kinds of semantics.
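For orientation, here is the rigid format that such general semantics
relaxes (a schematic counterpart clause in my own notation, not the
semantics of the papers): in a model with accessibility R, counterpart
relation C and domains D(w),

  ⟨w, a⟩ ⊨ □φ(x)  iff  for all v with w R v and all b ∈ D(v)
  such that a C b: ⟨v, b⟩ ⊨ φ(x).

The general semantics admits a larger class of structures than this
format does, which is what creates room for a uniform completeness
proof.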
Marcus Kracht
Last Modified: Wed Feb 14 15:00:00 2005