Abstract: Various authors have recently endorsed Harmonic Grammar (HG) as a replacement for Optimality Theory (OT). One argument for this move is based on computational considerations: OT looks prima facie like an exotic framework with no counterpart in Machine Learning, and the switch to HG allows methods and results from Machine Learning to be imported into Computational Phonology; see for instance Potts et al. (2010), Pater (2009), Hayes & Wilson (2008), Coetzee & Pater (2008), Boersma & Pater (2007, 2008), Jesney & Tessier (2007, 2008), among others. This paper shows that this argument in favor of HG and against OT is wrong: I prove a simple, general result stating that algorithms for HG can be adapted to OT rather trivially. Thus, HG has no computational advantages over OT. This simple result has far-reaching implications for Computational OT, as it allows classical methods and techniques from Machine Learning to be imported into Computational OT. I illustrate the fruitfulness of this new approach to Computational OT by showing that it leads to substantial progress in the theory of online algorithms for OT. In particular, I show that it yields a convergence proof for a slight variant of Boersma's (1997) (non-stochastic) Gradual Learning Algorithm, based on the convergence proof for the classical Perceptron Algorithm.
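To make the Perceptron/GLA parallel mentioned above concrete, here is a minimal, purely illustrative sketch; it is not the paper's formal construction or its specific GLA variant. The function names, the toy encoding of candidates as constraint-violation vectors, and the update rates are my own assumptions for illustration.

```python
# Illustrative sketch only: function names and the violation-vector encoding
# are assumptions introduced here, not the paper's notation or proof.
import numpy as np


def perceptron_hg_update(weights, winner_violations, loser_violations, rate=1.0):
    """One perceptron-style update of HG constraint weights.

    A candidate's harmony is the negative weighted sum of its violations.
    When the current weights wrongly prefer the loser, the weights are
    nudged by the difference of the two violation vectors.
    """
    # Constraints violated more by the loser gain weight; constraints
    # violated more by the winner lose weight.
    return weights + rate * (loser_violations - winner_violations)


def gla_ot_update(ranking_values, winner_violations, loser_violations, rate=1.0):
    """One GLA-style update of OT ranking values (a generic promotion/demotion
    scheme in the spirit of Boersma 1997; the paper's variant may differ).

    Formally analogous to the perceptron update above: winner-preferring
    constraints are promoted and loser-preferring constraints are demoted,
    each by a fixed plasticity `rate`.
    """
    winner_preferring = loser_violations > winner_violations
    loser_preferring = winner_violations > loser_violations
    return ranking_values + rate * winner_preferring - rate * loser_preferring


if __name__ == "__main__":
    # Toy example with three constraints.
    winner = np.array([0.0, 1.0, 0.0])  # violation profile of the intended winner
    loser = np.array([1.0, 0.0, 1.0])   # violation profile of the wrongly preferred loser

    w = perceptron_hg_update(np.zeros(3), winner, loser)
    print("HG weights after one update:", w)

    r = gla_ot_update(np.zeros(3), winner, loser)
    print("OT ranking values after one update:", r)
```

The point of the sketch is only that the two updates share the same error-driven form, which is what lets perceptron-style convergence arguments be transferred to the OT setting.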