<OT> New Posting: ROA-980
roa at ruccs.rutgers.edu
Wed Jul 2 11:58:47 PDT 2008
ROA 980-0708
Some correct error-driven versions of the Constraint Demotion algorithm
Paul Boersma <paul.boersma at uva.nl>
Direct link: http://roa.rutgers.edu/view.php3?roa=980
Abstract:
This paper supersedes ROA 953: a proof of correct convergence
is now included. It shows that Error-Driven Constraint
Demotion (EDCD), an error-driven learning algorithm proposed
by Tesar (1995) for Prince and Smolensky's (1993) version
of Optimality Theory, can fail to converge to a correct
totally ranked hierarchy of constraints, unlike the earlier
non-error-driven learning algorithms proposed by Tesar and
Smolensky (1993). The cause of the problem is found in Tesar's
use of 'mark-pooling ties', indicating that EDCD can be
repaired by assuming Anttila's (1997) 'permuting ties' instead.
Proofs show, and simulations confirm, that totally ranked
hierarchies can indeed be found by both this repaired version
of EDCD and Boersma's (1998) Minimal Gradual Learning Algorithm.
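
For concreteness, the following Python sketch illustrates one error-driven
constraint-demotion step over a stratified hierarchy, together with the two
ways of handling ties that the abstract contrasts: 'pooling' the marks of a
stratum versus 'permuting' its constraints into a random total ranking (in
the spirit of Anttila 1997). It is an informal illustration only, not the
paper's formalization; the function names, the tableau encoding, and the
single-tableau learning loop are assumptions of this sketch.

import random

def optimum(candidates, strata, mode, rng=random):
    # candidates: dict candidate name -> dict constraint -> violation count
    # strata: list of lists of constraint names, highest-ranked stratum first
    if mode == "permuting":
        # Draw one total ranking compatible with the strata (a random
        # permutation inside each stratum); every constraint then forms
        # its own stratum.
        ranking = []
        for stratum in strata:
            shuffled = list(stratum)
            rng.shuffle(shuffled)
            ranking.extend([c] for c in shuffled)
    else:
        # Mark pooling: the constraints of a stratum act as one constraint
        # whose violations are the sum of their violations.
        ranking = strata
    def profile(name):
        return [sum(candidates[name][c] for c in stratum) for stratum in ranking]
    # Lexicographic comparison of violation profiles, highest stratum first.
    return min(candidates, key=profile)

def demote(strata, candidates, winner, loser):
    # One constraint-demotion step on a winner/loser pair: every constraint
    # that prefers the loser, and is not already ranked below the highest
    # winner-preferring constraint, is demoted to just below that constraint.
    prefers_loser = {c for s in strata for c in s
                     if candidates[winner][c] > candidates[loser][c]}
    prefers_winner = {c for s in strata for c in s
                      if candidates[winner][c] < candidates[loser][c]}
    # With data generated by a totally ranked grammar, at least one
    # constraint prefers the winner whenever the learner errs.
    pivot = min(i for i, s in enumerate(strata) if prefers_winner & set(s))
    movers = {c for i, s in enumerate(strata) if i <= pivot
              for c in s if c in prefers_loser}
    new = [[c for c in s if c not in movers] for s in strata]
    if pivot + 1 == len(new):
        new.append([])
    new[pivot + 1].extend(sorted(movers))
    return [s for s in new if s]

# Error-driven use on a single tableau: whenever the learner's own optimum
# differs from the observed winner, apply one demotion step.
def learn(winners, candidates, strata, mode, steps=1000, rng=random):
    for _ in range(steps):
        winner = rng.choice(winners)
        guess = optimum(candidates, strata, mode, rng)
        if guess != winner:
            strata = demote(strata, candidates, winner, guess)
    return strata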
Comments:
Keywords: learnability, variation
Areas: Learnability, Language Acquisition, Computation
Type: Remark or Reply