Title: Learning to be insensitive to weight in Pintupi
Comment: ... are welcome
Abstract: It is usually taken for granted that normally developing children acquiring one and the same language end up with one and the same grammar (e.g. Chomsky & Halle 1968:251). The language-acquiring child is assumed to be capable of constructing the adult grammar from the information provided in the speech stream, even though that information may be incomplete, containing ambiguities and gaps in the data the child is exposed to (known as the poverty of the stimulus problem; e.g. Chomsky 1986:7).
The computer simulations of acquisition presented here show that the final grammars of virtual learners can differ even though the learners were trained on the same data and produce the same output as that given in the training data. This is demonstrated by modelling word stress in Pintupi, a language spoken in Western Australia (Hansen & Hansen 1969). The grammatical framework is Optimality Theory (Prince & Smolensky 1993); the learning algorithms of the computer simulations are Error Driven Constraint Demotion (Tesar 1995) and the Gradual Learning Algorithm (Boersma 1997).
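To give a sense of how error-driven learning of a constraint ranking proceeds, the following is a minimal sketch of Error Driven Constraint Demotion on a toy two-candidate example. The constraint names (*CLASH, PARSE) and violation profiles are invented for illustration and are not taken from the Pintupi analysis; rankings are modelled as ordered strata of constraints.

```python
# Minimal sketch of Error Driven Constraint Demotion (Tesar 1995).
# A ranking is a list of strata (sets of constraint names), highest first.
# Candidates are dicts mapping constraint names to violation counts.
# Constraint names and violation profiles here are invented examples.

def eval_cost(cand, strata):
    # Harmony as a lexicographic tuple: summed violations per stratum,
    # highest-ranked stratum first. Lower tuples are more harmonic.
    return tuple(sum(cand.get(c, 0) for c in s) for s in strata)

def edcd_update(winner, loser, strata):
    """On an error (the intended winner loses to the learner's current
    optimum), demote every loser-preferring constraint to the stratum
    just below the highest winner-preferring constraint."""
    if eval_cost(winner, strata) <= eval_cost(loser, strata):
        return strata  # no error: the winner is already optimal
    # Highest stratum containing a winner-preferring constraint:
    cutoff = next(i for i, s in enumerate(strata)
                  if any(winner.get(c, 0) < loser.get(c, 0) for c in s))
    new = [set(s) for s in strata] + [set()]
    for i in range(cutoff + 1):
        for c in list(new[i]):
            if loser.get(c, 0) < winner.get(c, 0):  # loser-preferring
                new[i].discard(c)
                new[cutoff + 1].add(c)
    return [s for s in new if s]  # drop strata emptied by demotion

# Toy example: the adult form (winner) violates *CLASH once, the
# learner's current optimum (loser) violates PARSE once.
strata = [{'*CLASH'}, {'PARSE'}]        # initial ranking: *CLASH >> PARSE
winner = {'*CLASH': 1, 'PARSE': 0}
loser  = {'*CLASH': 0, 'PARSE': 1}
strata = edcd_update(winner, loser, strata)
print(strata)  # → [{'PARSE'}, {'*CLASH'}]: *CLASH demoted below PARSE
```

After the single demotion step, the ranking PARSE >> *CLASH makes the intended winner optimal, so no further errors arise on this datum. Which constraints get demoted, and in what order errors happen to occur, is exactly the kind of path dependence that can leave different learners with different final rankings.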