Lexicalist grammars were a hot topic in the late 80s and early-to-mid 90s. I came up through educational institutions that Mark Steedman had just left or was just about to join (in one case, Edinburgh, it was both). The Edinburgh lexicalist tendency got me a job at Sharp, and the paper I wrote on Stochastic HPSG probably helped get my Ohio State job. Julia Hockenmaier, who later worked with Mark on CCGbank, was my advisee during her M.Sc. (not CCG-related) and in the first year of her Ph.D. David Tugwell was my first Ph.D. advisee; he wrote a dissertation that is obviously lexicalist in inclination, and one that also hooks in nicely to the current interest in dependencies as a syntactico-semantic representation.
2012
Dennis N. Mehay and Chris Brew (2012). CCG Syntactic Reordering Models for Phrase-based Machine Translation. In Proceedings of WMT-2012, Montreal, Canada.
2011
Stephen Boxwell, Chris Brew, Jason Baldridge, Dennis Mehay and Sujith Ravi. 2011. Semantic Role Labeling for CCG Without Treebanks? Proceedings of the International Joint Conference on Natural Language Processing (IJCNLP). Chiang Mai, Thailand.
2010
Stephen A. Boxwell, Dennis Mehay, and Chris Brew. 2010. What a Parser can Learn from a Semantic Role Labeler and Vice-Versa. 2010 Conference on Empirical Methods in Natural Language Processing (EMNLP-2010). MIT, Cambridge, MA, USA.
Boxwell, Stephen and Chris Brew (2010). A Pilot Arabic CCGbank. 7th International Conference on Language Resources and Evaluation, Valletta, Malta, 17–20 May 2010.
1995
Brew, Chris (1995). Stochastic HPSG. In Proceedings of the 7th Conference of the European Chapter of the Association for Computational Linguistics, pages 83–89, Dublin, Ireland, March 28–31. University College Dublin.
Grover, C., Brew, C., Manandhar, S., Moens, M., and Schöter, A. (1995). Priority union and generalization in discourse grammar. In A. Schöter and C. Vogel, editors, Edinburgh Working Papers in Cognitive Science, Vol. 10: Nonclassical feature systems, pages 157–196. University of Edinburgh.
1994
Grover, Claire, Chris Brew, Marc Moens, and Suresh Manandhar (1994) Priority union and generalisation in discourse grammar. In Proceedings of the 32nd Annual Meeting of the Association for Computational Linguistics, pages 17–24, Las Cruces, New Mexico.
Brew, Chris 1994. Comments on Eisele: types and clauses: two styles of probabilistic processing in CUF. In Jochen Dörre, editor, Computational Aspects of Constraint-Based Linguistic Description, Vol. II. ILLC/Department of Philosophy, University of Amsterdam, pages 23–28. DYANA-2 Deliverable R1.2.B.
1993
Brew, Chris 1993. Adding preferences to CUF. In Jochen Dörre, editor, Computational Aspects of Constraint-Based Linguistic Description, Vol. I. ILLC/Department of Philosophy, University of Amsterdam, August. DYANA-2 Deliverable R1.2.A.
1992
Brew, Chris (1992). Letting the Cat out of the Bag: generation for shake-and-bake MT. In Proceedings of the 14th International Conference on Computational Linguistics, pages 610–616, Nantes, France.
Dennis Mehay (tbd.)
Bean Soup Translation: Flexible, Linguistically-motivated Syntax for Machine Translation
The main contribution of this thesis is to use the flexible syntax of Combinatory Categorial Grammar [CCG, Steedman, 2000] as the basis for deriving syntactic constituent labels for target strings in phrase-based systems, providing CCG labels for many target strings that traditional syntactic theories struggle to describe. These CCG labels are used to train novel syntax-based reordering and language models, which efficiently describe translation reordering patterns and assess the grammaticality of target translations. The models are easily incorporated into phrase-based systems with minimal disruption to existing technology, and they achieve superior automatic metric scores and human evaluation ratings over a strong phrase-based baseline, as well as over syntax-based techniques that do not use CCG.
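A minimal sketch of the "flexible syntax" point, since it is what the reordering models depend on (my own illustration in Python, not code or an example taken from the thesis; the tiny combinator implementation and category names are simplified): CCG's type-raising and forward composition assign a single category to a string like "John might", which is not a constituent in most traditional syntactic theories but is exactly the kind of target-side span a phrase-based system extracts and would like to label.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Cat:
        """A CCG category: atomic (e.g. NP) or a slash category X/Y or X\\Y."""
        result: object = None   # Cat, for complex categories
        slash: str = None       # '/' or '\\', None for atomic categories
        arg: object = None      # Cat, for complex categories
        atom: str = None        # name, for atomic categories

        def __str__(self):
            return self.atom if self.atom else f"({self.result}{self.slash}{self.arg})"

    def atomic(name): return Cat(atom=name)
    def fwd(res, arg): return Cat(result=res, slash='/', arg=arg)
    def bwd(res, arg): return Cat(result=res, slash='\\', arg=arg)

    def forward_compose(x, y):
        """Forward composition (>B): X/Y  Y/Z  =>  X/Z."""
        if x.slash == '/' and y.slash == '/' and x.arg == y.result:
            return fwd(x.result, y.arg)
        return None

    def type_raise(x, t):
        """Forward type-raising (>T): X => T/(T\\X)."""
        return fwd(t, bwd(t, x))

    NP, S = atomic('NP'), atomic('S')
    john = NP                            # John  := NP
    might = fwd(bwd(S, NP), bwd(S, NP))  # might := (S\NP)/(S\NP)

    # "John might" receives the single category S/(S\NP) via >T then >B, so a
    # CCG-based model can label this span even though it is not a classical
    # constituent.
    label = forward_compose(type_raise(john, S), might)
    print(label)  # (S/(S\NP))

Spans labeled this way are what the thesis feeds to its reordering and syntactic language models.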
Stephen Boxwell (DoD, Fort Meade)
A CCG-Based Method for Training a Semantic Role Labeler in the Absence of Explicit Syntactic Training Data
Treebanks are a necessary prerequisite for many NLP tasks, including, but not limited to, semantic role labeling. For many languages, however, treebanks are either nonexistent or too small to be useful. Time-critical applications may require rapid deployment of natural language software for a new critical language, much faster than the development time of a traditional treebank. This dissertation describes a method for generating a treebank and training syntactic and semantic models using only semantic training information; that is, no human-annotated syntactic training data whatsoever. This will greatly increase the speed of development of natural language tools for new critical languages in exchange for a modest drop in overall accuracy. Using Combinatory Categorial Grammar (CCG) in concert with PropBank semantic role annotations, in combination with a partially hidden Markov model, allows us to accurately predict lexical categories. By training the Berkeley parser on our generated syntactic data, we can achieve SRL performance of 65.5% without using a treebank, as opposed to 74% using the same feature set with gold-standard data.
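To make the bootstrapping idea concrete, here is a rough illustration (my own, not the dissertation's model or code) of why semantic role annotations constrain a verb's CCG lexical category at all: the number of core PropBank roles suggests the verb's arity, and hence a small set of candidate categories; the remaining ambiguity, along with the categories of unannotated words, is what the partially hidden Markov model resolves. The role-to-category table below is a simplified guess for illustration only.

    # A verb's annotated core PropBank roles (ARG0, ARG1, ...) suggest its arity,
    # and the arity suggests a handful of candidate CCG lexical categories.
    CANDIDATES_BY_ARITY = {
        1: [r"S\NP"],                                # one core role: intransitive-like
        2: [r"(S\NP)/NP", r"(S\NP)/PP"],             # two core roles: transitive-like
        3: [r"((S\NP)/NP)/NP", r"((S\NP)/PP)/NP"],   # three core roles: ditransitive-like
    }

    def candidate_categories(roles):
        """Map a verb's annotated roles, e.g. {'ARG0', 'ARG1', 'ARGM-TMP'}, to a
        small set of plausible CCG categories. The dissertation's method does not
        pick one directly: a partially hidden Markov model disambiguates these
        candidates together with the categories of the unannotated words."""
        core = {r for r in roles if r.startswith("ARG") and not r.startswith("ARGM")}
        return CANDIDATES_BY_ARITY.get(len(core), [r"S\NP"])

    print(candidate_categories({"ARG0", "ARG1", "ARGM-TMP"}))  # ['(S\\NP)/NP', '(S\\NP)/PP']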
David Tugwell (ITRI, Brighton)
Dynamic Syntax
In this thesis, I shall argue that dynamic "left-to-right" grammars have been undeservedly neglected as models of natural language syntax, and that they allow more general and elegant syntactic descriptions than has previously been appreciated. ... I propose a concrete example of a left-to-right model of syntax.