Title: Cognitive biases, linguistic universals, and constraint-based grammar learning
Authors: Jennifer Culbertson, Paul Smolensky, Colin Wilson
Comment: To appear in TopiCS in Cognitive Science, issue on computational psycholinguistics, J. Hale & D. Reitter, Eds.
Length: 25 pp.
Abstract: According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology -- the distribution of linguistic patterns across the world's languages -- and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern of results of artificial grammar experiments on noun-phrase word order (Culbertson, Smolensky & Legendre, 2012). Our proposal has several novel properties that distinguish it from prior work in the domains of linguistic theory, computational cognitive science, and machine learning. The paper illustrates how ideas from these domains can be synthesized into a model of language learning in which biases range in strength from hard (absolute) to soft (statistical), and in which language-specific and domain-general biases combine to account for data ranging from the macro-level scale of typological distributions to the micro-level scale of learning by individuals.
Type: Paper/tech report
Area/Keywords: Learning biases, universals, acquisition, artificial language learning
Article: Version 1