Generation, Recognition, and Learning
in Finite State Optimality Theory
Jason Riggle – 2004
Welcome to my dissertation beta-tester page. I’m
currently circulating the manuscript and code below in hopes of getting some
feedback before I release this work to the world at large at the end of August.
Please feel free to read the dissertation without playing with the Prolog program, or to play with the code without reading the dissertation. Any and all comments on the dissertation, the code, or anything else will be greatly appreciated. If you would like to send comments, you can email them to me or just write all over the PDF and send that back. If you have any trouble getting the program running, email me and I’ll walk you through it.
The Dissertation – (beta version)
pdf – Generation, Recognition, and Learning in Finite State Optimality Theory – One-up
pdf – Generation, Recognition, and Learning in Finite State Optimality Theory – Two-up, with only a 24% reduction in font size (nice for printing)
The Code – change the file suffixes from .txt to .pl once you’ve downloaded these
pl – This is the Prolog code for the algorithms. You’ll need Prolog, Ghostview, and Graphviz installed to run it properly, and don’t forget to change the file suffixes to .pl.
pl – This is Eval for the ranking Onset >> NoCoda >> Dep >> Max. You don’t need this because you can build your own EVAL.pl, but my code looks for this file when it is compiled.
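In case it helps to see the shape of things before you open the files, here is a minimal sketch of what a ranking file could look like and of loading the code from the Prolog prompt. The predicate ranking/1, the constraint atoms, and the file name below are illustrative placeholders, not the program’s actual format; the real representation is whatever you find in EVAL.pl.

    %% Schematic only: the real EVAL.pl may encode Eval differently.
    %% ranking/1 and these constraint atoms are illustrative placeholders.
    ranking([onset, noCoda, dep, max]).   % Onset >> NoCoda >> Dep >> Max

    %% After changing the suffixes from .txt to .pl, start Prolog and
    %% consult the main file (the file name here is a placeholder too):
    %% ?- [main].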
Prolog, Graphviz, and Ghostview
Ghostview – For Windows you’ll need to add Ghostview to your path so that it can be called from the command line. Go to: My Computer/Properties/Advanced/Environment Variables, highlight Path, then click Edit and add “C:\PROGRA~1\Ghostgum\gsview;” to your path, then restart your machine. If you have any trouble, send me an email.
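For the curious: gsview needs to be on your path because the pictures get displayed by calling it from inside Prolog, roughly as in the schematic sketch below. The predicate view_ps, the executable name gsview32, and the example file name are placeholders; shell/1 is the SWI-Prolog built-in for running a command.

    %% Schematic sketch: display a PostScript file by invoking Ghostview.
    %% Assumes gsview32 (the usual Windows executable name) is on the path.
    view_ps(File) :-
        atom_concat('gsview32 ', File, Cmd),
        shell(Cmd).

    %% e.g.  ?- view_ps('machine.ps').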