Title:Computational Optimality Theory
Authors:Bruce Tesar
Comment:(Ph.D. Diss, 1995)
Abstract: Computational Optimality Theory

ROA-90 (121 pages)


Bruce Tesar

Center for Cognitive Science /

Department of Linguistics

Rutgers University


This is my Ph.D. thesis, just completed at the University of Colorado.

In Optimality Theory, a linguistic input is assigned a grammatical structural description by selecting, from an infinite set of candidate structural descriptions, the description which best satisfies a ranked set of universal constraints. Cross-linguistic variation is explained as different rankings of the same universal constraints. Two questions are of primary interest concerning the computational tractability of Optimality Theory. The first concerns the ability to compute optimal structural descriptions. The second concerns the learnability of the constraint rankings.

Parsing algorithms are presented for the computation of optimal forms, using dynamic programming. These algorithms work for grammars in Optimality Theory employing universal constraints which may be evaluated on the basis of information local within the structural description. This approach exploits optimal substructure to construct the optimal description, rather than searching for the solution by moving from one entire description to another.
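The optimal-substructure idea can be illustrated with a toy sketch (not the dissertation's actual algorithm): each input segment receives a local decision, constraints assess each decision locally, and a left-to-right pass keeps only the best partial parse, comparing violation vectors lexicographically under the ranking. The constraints (*C, PARSE), the "parse"/"unparse" actions, and the ranking are all hypothetical.

```python
# Toy sketch of dynamic-programming parsing under a constraint ranking.
# Hypothetical mini-grammar: each segment is either PARSEd or left unparsed;
# constraints are evaluated locally, so partial parses can be extended.

def parse_con(seg, action):   # PARSE: violated when a segment is left unparsed
    return 1 if action == "unparse" else 0

def no_c(seg, action):        # *C: violated when a consonant is parsed
    return 1 if action == "parse" and seg == "C" else 0

RANKING = [no_c, parse_con]   # one hypothetical ranking: *C >> PARSE

def violations(seg, action):
    return tuple(c(seg, action) for c in RANKING)

def add(v, w):
    return tuple(a + b for a, b in zip(v, w))

def optimal_parse(segments):
    """Left-to-right pass keeping only the best partial description.
    Optimal substructure: the best full parse extends a best prefix parse,
    so we never enumerate the full candidate set."""
    best = ((), (0,) * len(RANKING))   # (actions so far, violation vector)
    for seg in segments:
        candidates = [
            (best[0] + (action,), add(best[1], violations(seg, action)))
            for action in ("parse", "unparse")
        ]
        # Lexicographic comparison of violation vectors = ranked evaluation.
        best = min(candidates, key=lambda c: c[1])
    return best

print(optimal_parse(["C", "V", "C"]))
# -> (('unparse', 'parse', 'unparse'), (0, 2)): consonants stay unparsed
#    because *C dominates PARSE under this ranking.
```

Because the constraints are local, combining the best prefix with the best local decision yields the best extension, which is what licenses the dynamic-programming construction in place of search over whole descriptions.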

A class of learning algorithms, the Constraint Demotion algorithms, is presented; these algorithms solve the problem of learning constraint rankings based upon hypothesized structural descriptions (an important subproblem of the general problem of language learning). Constraint Demotion exploits the implicit negative evidence available in the form of the competing (suboptimal) structural descriptions of the input. The data complexity of this algorithm is quadratic in the number of constraints.
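The core demotion step can be sketched as follows; this is a compact illustration of the idea, not Tesar's exact procedure. Constraints live in ranked strata (0 = highest); for each winner-loser pair, shared violation marks are cancelled, and every constraint assessing an uncancelled winner mark is demoted just below the highest stratum containing an uncancelled loser mark. The constraint names and data pair are hypothetical.

```python
# Sketch of the Constraint Demotion idea: learn a stratified ranking from
# winner-loser pairs, using losers as implicit negative evidence.

def constraint_demotion(pairs, constraints):
    """pairs: list of (winner_marks, loser_marks), each a set of names of
    constraints violated by the optimal (winner) and a competing (loser)
    structural description for the same input."""
    stratum = {c: 0 for c in constraints}   # start with all constraints on top
    changed = True
    while changed:
        changed = False
        for winner, loser in pairs:
            # Mark cancellation: ignore violations shared by both candidates.
            w_marks = winner - loser
            l_marks = loser - winner
            if not l_marks:
                continue
            top_loser = min(stratum[c] for c in l_marks)  # highest loser stratum
            for c in w_marks:
                # Demote winner-violating constraints below that stratum.
                if stratum[c] <= top_loser:
                    stratum[c] = top_loser + 1
                    changed = True
    return stratum

# Hypothetical datum: the winner violates PARSE, a losing competitor violates *C.
ranking = constraint_demotion([({"PARSE"}, {"*C"})], ["PARSE", "*C"])
print(ranking)   # PARSE is demoted below *C: {'PARSE': 1, '*C': 0}
```

Each informative pair forces at least one demotion, and no constraint can be demoted past the bottom, which is the intuition behind the quadratic bound on the number of informative examples.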


Area/Keywords:Computation,Formal Analysis,Learnability
Article:Version 1