Abstract: This paper describes an algorithm for learning constraint rankings, assuming access to the correct underlying forms but not to the full correct structural descriptions of observed, overt forms. It contends with the ambiguity of overt forms, where the overt information available to the learner supports more than one structural description, by entertaining separate ranking hypotheses for the different structural descriptions. Each possible structural description of an overt form is checked for consistency with the data previously seen by the learner, and inconsistent hypotheses are discarded. The checking is accomplished via an algorithm called Multi-Recursive Constraint Demotion, which is derived from the Recursive Constraint Demotion algorithm of Tesar & Smolensky (1994). This procedure efficiently detects when a set of data is inconsistent. The result is a learning algorithm that is guaranteed to find a constraint ranking consistent with a set of overt forms whenever such a ranking exists. Simulation results from applying this learning algorithm to OT systems for metrical stress are presented and discussed. The simulation results provide the basis for an empirical argument that this learning procedure is not only far more efficient than brute-force enumeration of all possible constraint rankings, but may in fact be efficient enough to plausibly play a role in child language acquisition.
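To illustrate the style of consistency checking the abstract refers to, the following is a minimal sketch of Recursive Constraint Demotion over winner-loser pairs. The function name, the encoding of each pair as two sets of constraint names (constraints preferring the winner, constraints preferring the loser), and the return convention are illustrative assumptions, not the paper's exact formulation; the core idea — repeatedly rank every constraint that prefers no remaining loser, and report inconsistency when no constraint can be ranked — follows the standard description of RCD.

```python
def rcd(pairs, constraints):
    """Sketch of Recursive Constraint Demotion (illustrative encoding).

    pairs: list of (winner_prefs, loser_prefs), each a set of constraint
           names; winner_prefs are constraints that prefer the winner,
           loser_prefs those that prefer the loser in that pair.
    Returns (strata, consistent): strata is a list of sets of constraints,
    highest-ranked first; consistent is False if the pairs admit no ranking.
    """
    remaining = list(pairs)
    unranked = set(constraints)
    strata = []
    while unranked:
        # A constraint may be placed in the current stratum only if it
        # prefers no loser among the pairs not yet accounted for.
        stratum = {c for c in unranked
                   if not any(c in loser_prefs
                              for _, loser_prefs in remaining)}
        if not stratum:
            # No constraint can be ranked: the data are inconsistent.
            return strata, False
        strata.append(stratum)
        unranked -= stratum
        # Discard pairs now accounted for by a winner-preferring
        # constraint in the newly placed stratum.
        remaining = [(wp, lp) for wp, lp in remaining if not (wp & stratum)]
    return strata, not remaining
```

For example, the pairs {A prefers winner, B prefers loser} and {B prefers winner, C prefers loser} yield the strata [{A}, {B}, {C}], while the mutually contradictory pairs (A over B, B over A) are reported as inconsistent — the detection step the abstract attributes to Multi-Recursive Constraint Demotion.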