Title: Does MaxEnt Overgenerate? Implicational Universals in Maximum Entropy Grammar
Authors: Arto Anttila, Giorgio Magri
Comment: to appear in 'AMP 2017: Proceedings of the 2017 Annual Meeting on Phonology', eds. Gillian Gallagher, Maria Gouskova, and Sora Yin. Washington, DC: Linguistic Society of America.
Abstract: A good linguistic theory should neither undergenerate (i.e., it should not miss any attested patterns) nor overgenerate (i.e., it should not predict any 'unattestable' patterns). We investigate the question of overgeneration in Maximum Entropy Grammar (ME) in the context of basic syllabification (Prince and Smolensky 2004) and obstruent voicing (Lombardi 1999), using the theory's T-orders as a measure of typological strength. We find that ME has non-trivial T-orders, but that compared to those of OT and HG they are relatively sparse and sometimes linguistically counterintuitive. The fact that many reasonable implicational universals fail under ME suggests that the theory overgenerates, at least in the two phonological examples we examine. More generally, our results serve as a reminder that linguistic theories should be evaluated in terms of both descriptive fit and explanatory depth. A good theory succeeds on both fronts: we want a flexible theory that best fits the data, but we also want an informative theory that excludes unnatural patterns and derives the correct implicational universals.
Type: Paper/tech report
Area/Keywords: HG, stochastic HG, MaxEnt, T-orders, basic syllabification, voicing
Article: Version 1