Phonological Acquisition in Optimality Theory: The Early Stages
To appear in Kager, René, Pater, Joe, and Zonneveld, Wim (eds.), Fixing Priorities: Constraints in Phonological Acquisition. Cambridge University Press.
Recent experimental work indicates that by the age of ten months, infants have already learned a great deal about the phonotactics (legal sounds and sound sequences) of their language. This learning occurs before infants can utter words or apprehend most phonological alternations. I will show that this early learning stage can be straightforwardly modeled with Optimality Theory. Specifically, the Markedness and Faithfulness constraints can be ranked so as to characterize the phonotactics, even when no information about morphology or phonological alternations is yet available. I will also show how, later on, the information acquired in infancy can help the child come to grips with the alternation pattern. I also propose a procedure for undoing the learning errors that are likely to occur at the earliest stages.
There are two specific formal proposals. One is a constraint ranking algorithm, based closely on Tesar and Smolensky's Constraint Demotion, which mimics the early, "phonotactics only" form of learning seen in infants. I illustrate the algorithm's effectiveness by having it learn the phonotactic pattern of a simplified language modeled on Korean. The other proposal is that there are three distinct default rankings for phonological constraints: low for ordinary Faithfulness (used in learning phonotactics); low for Faithfulness to adult forms (in the child's own production system); and high for output-to-output correspondence constraints.
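To make the ranking idea concrete, here is a minimal sketch of Recursive Constraint Demotion (Tesar and Smolensky), extended with a markedness-first bias in the spirit of Biased Constraint Demotion (Prince and Tesar 1999). This is an illustrative reconstruction, not the paper's Low Faithfulness algorithm itself; the constraint names and data in the example are hypothetical.

```python
# Sketch of Recursive Constraint Demotion with an optional
# markedness-first bias (hedged reconstruction, not Hayes's exact
# Low Faithfulness Constraint Demotion algorithm).

def rcd(constraints, pairs, markedness=None):
    """constraints: list of constraint names.
    pairs: list of (winner_prefs, loser_prefs) pairs, each a set of
      constraint names preferring the winner or the loser candidate.
    markedness: optional set of markedness constraints to be ranked
      as high as possible (the bias used in phonotactic learning).
    Returns a list of strata, highest-ranked first."""
    remaining = list(pairs)
    unranked = list(constraints)
    strata = []
    while unranked:
        # A constraint may be installed in the current stratum if it
        # prefers no loser in any remaining winner-loser pair.
        installable = [c for c in unranked
                       if not any(c in losers for _, losers in remaining)]
        if not installable:
            raise ValueError("inconsistent data: no ranking exists")
        if markedness is not None:
            marked = [c for c in installable if c in markedness]
            if marked:  # bias: install markedness before faithfulness
                installable = marked
        strata.append(installable)
        unranked = [c for c in unranked if c not in installable]
        # Discard pairs now accounted for by the new stratum.
        remaining = [(winners, losers) for winners, losers in remaining
                     if not (winners & set(installable))]
    return strata

# Toy example: the markedness constraint *VcdObs must outrank
# faithfulness Ident(voice) to ban voiced obstruents from outputs.
# rcd(["*VcdObs", "Ident(voice)"],
#     [({"*VcdObs"}, {"Ident(voice)"})],
#     markedness={"*VcdObs"})
# returns [["*VcdObs"], ["Ident(voice)"]]
```

The markedness-first bias matters for phonotactic learning: ranking Faithfulness as low as the data permit yields a restrictive grammar that excludes unattested forms, rather than one that merely reproduces the learning data.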
Download (Preprint version of May 2001. For ROA-posted 1999 version, click here.)
Click on the icons to view files that trace the actions of each learning algorithm. For the purposes of these simulations, see the text of the paper, Appendix A.
Group 1: Basic Comparison of Algorithms
See the text of this paper, section 7.2.
See Prince and Tesar (1999), pp. 19-22.
Input files for the simulations (Excel format) **
Tableaux for the correct grammar
Learned correctly by Low Faithfulness Constraint Demotion: simulation history
Learned incorrectly by Biased Constraint Demotion: simulation history
Learned correctly by Biased Constraint Demotion, supplemented with Favor Specificity: simulation history
Group 2: Two Algorithms Take On Pseudo-Korean Armed with a "Crazy" Constraint (*Unaspirated)
See the text of this paper, Appendix A.
Input file for the simulations (Excel format) **
Learned incorrectly by Low Faithfulness Constraint Demotion: simulation history
Learned correctly by Biased Constraint Demotion, supplemented with Favor Specificity: simulation history
[Click here to obtain a free PDF reader]
**[Click here to obtain OTSoft, on which the simulations were run. Source code is included.]
Back to Bruce Hayes's Home Page