Abstract: Optimality-Theoretic learning algorithms are only guaranteed to succeed if the data fed to them contain full structural descriptions of the surface forms, i.e. descriptions that include hidden structure such as metrical feet. This is not realistic as a model of acquisition, because children are exposed only to overt forms, e.g. unstructured strings of syllables. Optimality-Theoretic learning algorithms that learn solely from overt forms turn out sometimes to succeed and sometimes to fail (Tesar & Smolensky 2000). This possibility of failure is a property of both on-line learning algorithms that have been proposed for OT, namely Error-Driven Constraint Demotion (EDCD; Tesar 1995) and the Gradual Learning Algorithm (GLA; Boersma 1997). The possibility of failure is not necessarily bad: one would want an algorithm to fail for languages that do not exist, and to succeed for languages that do exist. Latin exists (or existed). This paper compares the performance of the two learning algorithms on the metrical stress system of Classical Latin. It turns out that EDCD cannot learn this system from overt forms alone, whereas the GLA can. This suggests that the GLA may be a better model of acquisition than EDCD. The results also bear on the debate in the literature about the correct linguistic analysis of Latin stress: if overt forms contain main stress only, the GLA leads the child to posit an analysis that uses uneven trochees (as in the analysis by Jacobs 2000) rather than strictly bimoraic trochees (as in the analyses by Mester 1994 and Hayes 1995).