Call for Papers

EXTENDED SUBMISSION DEADLINE: 1 May 2019 (extended from 26 April 2019)

Submission URL: https://www.softconf.com/acl2019/delfol/


While deep learning and neural networks have revolutionized the field of natural language processing, changed the habits of its practitioners and opened up new research directions, many aspects of the inner workings of deep neural networks remain unknown.


At the same time, we have access to many decades of accumulated knowledge on formal languages, grammars, and transductions, both weighted and unweighted, over strings as well as trees: closure properties, the computational complexity of various operations, relationships between the various classes, and many empirical and theoretical results on their learnability.


The goal of this workshop is to bring together researchers interested in how our understanding of formal languages can contribute to the understanding and design of neural network architectures for natural language processing. For example, fundamental work on neural networks has examined whether they can learn different classes of formal languages and, reciprocally, whether formal grammars or automata can closely approximate neural networks. Recently we have seen new research directions on what each formalism can bring to understanding or improving the other. Topics that fall within the purview of the workshop include, but are not limited to:


  • Learnability of formal languages with neural nets (both strong and weak learning)
  • Relationship between deep learning models and linguistically inspired formalisms
  • Connections between neural network architectures and classical computational models
  • Traditional formal grammars augmented through non-linearity
  • Hybrid models combining neural networks and finite state machines
  • The use of formal grammars to analyze and interpret the behavior of neural networks
  • Approximating neural networks with weighted automata and grammars
  • Including formal grammar constraints as symbolic priors in neural networks


The programme will consist of invited speakers, presentations of selected contributions, and a poster session for discussion. We are delighted to announce the following invited speakers:

  • Rémi Eyraud (Aix-Marseille University)
  • Robert Frank (Yale University)
  • John Kelleher (Technological University Dublin)
  • Kevin Knight (Didi)
  • Ariadna Quattoni (dMetrics)
  • Noah Smith (University of Washington)


We call for three types of papers:

(1) Regular workshop papers: original and unpublished work in the context of this workshop, up to 8 pages (excluding references). These will be included in the workshop proceedings.

(2) Extended abstracts: ongoing work, intermediate results, or position papers, up to 4 pages (excluding references). These will not be included in the proceedings.

(3) Cross-submissions: relevant work published elsewhere in the last 12 months. At submission time you will be asked to provide, along with the PDF, the reference of the original paper. These will not be included in the workshop proceedings.

Papers should be written using the ACL 2019 style files listed in the ACL call for papers and submitted through softconf at the URL above.

Note that, thanks to generous support from Naver Labs, we will be able to offer travel grants to some students.


Some recent work that falls within the scope of this call includes:


  • Bridging CNNs, RNNs, and Weighted Finite-State Machines. Roy Schwartz, Sam Thomson, and Noah A. Smith. (ACL 2018)
  • Rational Recurrences. Hao Peng, Roy Schwartz, Sam Thomson, and Noah A. Smith. (EMNLP 2018)
  • Recurrent Neural Networks as Weighted Language Recognizers. Y. Chen, S. Gilroy, A. Maletti, J. May, and K. Knight. (NAACL 2018)
  • Using Regular Languages to Explore the Representational Capacity of Recurrent Neural Architectures. Abhijit Mahalunkar and John D. Kelleher. (ICANN 2018)
  • Explaining Black Boxes on Sequential Data Using Weighted Automata. Stéphane Ayache, Rémi Eyraud, and Noé Goudian. (ICGI 2018)
  • Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples. Gail Weiss, Yoav Goldberg, and Eran Yahav. (ICML 2018)
  • Generalized Earley Parser: Bridging Symbolic Grammars and Sequence Data for Future Prediction. Siyuan Qi, Baoxiong Jia, and Song-Chun Zhu. (ICML 2018)
  • Efficient Gradient Computation for Structured Output Learning with Rational and Tropical Losses. Corinna Cortes, Vitaly Kuznetsov, Mehryar Mohri, Dmitry Storcheus, and Scott Yang. (NIPS 2018)
  • Composing RNNs and FSTs for Small Data: Recovering Missing Characters in Old Hawaiian Text. Oiwi Parker Jones and Brendan Shillingford. (IRASL workshop at NIPS 2018)
  • Verification of Recurrent Neural Networks Through Rule Extraction. Q. Wang, K. Zhang, X. Liu, and C. L. Giles. (arXiv 2018)
  • A Comparison of Rule Extraction for Different Recurrent Neural Network Models and Grammatical Complexity. Q. Wang, K. Zhang, A. G. Ororbia II, X. Xing, X. Liu, and C. L. Giles. (arXiv 2018)
  • Grammar Variational Autoencoder. Matt J. Kusner, Brooks Paige, and José Miguel Hernández-Lobato. (ICML 2017)
  • Subregular Complexity and Deep Learning. Enes Avcu, Chihiro Shibata, and Jeffrey Heinz. (LAML 2017)
  • Recurrent Neural Network Grammars. Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, and Noah A. Smith. (NAACL 2016)
  • Weighting Finite-State Transductions with Neural Context. Pushpendre Rastogi, Ryan Cotterell, and Jason Eisner. (NAACL 2016)
  • Connecting Weighted Automata and Recurrent Neural Networks Through Spectral Learning. G. Rabusseau, T. Li, and D. Precup. (AISTATS 2019)