
Martins A.F.T., Smith N.A., Aguiar P.M.Q., Figueiredo M.A.T.

EMNLP 2011 - Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference

pp. 238



Dual decomposition has recently been proposed as a way of combining complementary models, with a boost in predictive power. However, in cases where lightweight decompositions are not readily available (e.g., due to the presence of rich features or logical constraints), the original subgradient algorithm is inefficient. We sidestep that difficulty by adopting an augmented Lagrangian method that accelerates model consensus by regularizing towards the averaged votes. We show how first-order logical constraints can be handled efficiently, even though the corresponding subproblems are no longer combinatorial, and we report experiments in dependency parsing with state-of-the-art results.
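To illustrate the general idea behind augmented-Lagrangian dual decomposition (consensus accelerated by a quadratic penalty toward the averaged votes), here is a minimal ADMM-style sketch. It is not the paper's actual algorithm: the two "components", the linear scores, the closed-form clipped subproblem solver, and the penalty parameter `rho` are all simplifying assumptions made for illustration.

```python
import numpy as np

def solve_component(scores, u, lam, rho):
    # Hypothetical subproblem: maximize scores.z - lam.z - (rho/2)||z - u||^2
    # over z in [0, 1]^d. Because the objective is separable and quadratic,
    # the maximizer has a clipped closed form (an assumption for this sketch;
    # real components would call a combinatorial or specialized solver).
    z = u + (scores - lam) / rho
    return np.clip(z, 0.0, 1.0)

def dd_augmented_lagrangian(score_list, d, rho=1.0, iters=200):
    # One Lagrange multiplier vector per component, plus a consensus
    # variable u representing the "averaged votes" of the components.
    lams = [np.zeros(d) for _ in score_list]
    u = np.zeros(d)
    for _ in range(iters):
        # Each component solves its penalized subproblem independently.
        zs = [solve_component(s, u, lam, rho)
              for s, lam in zip(score_list, lams)]
        # Consensus step: regularize toward the average of the votes
        # (the multipliers sum to zero, so the average is exact here).
        u = np.mean(zs, axis=0)
        # Dual update: penalize each component's disagreement with u.
        lams = [lam + rho * (z - u) for lam, z in zip(lams, zs)]
    return u

rng = np.random.default_rng(0)
scores = [rng.normal(size=5), rng.normal(size=5)]  # two toy components
u = dd_augmented_lagrangian(scores, d=5)
```

The quadratic penalty is what distinguishes this from the plain subgradient method: each subproblem is pulled toward the current consensus `u`, which typically yields agreement in far fewer iterations, at the cost of subproblems that are no longer purely combinatorial.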