Some patterns are learned more readily than others. Which patterns are more learnable, and why?
Previous literature has shown that properties of natural languages lead to better learning of artificial grammars (Culbertson and Newport, 2017; Thompson and Newport, 2007; Getz and Newport, 2019; Fetch, 2020). Traditionally, syntactic movement has been seen as a challenge the learner must overcome, but in my dissertation I argued that it is a property of natural languages that actually helps make them more learnable! Movement in natural languages is highly constrained and therefore shapes the distributional cues in the input in a particular way. These cues may allow learners to constrain their hypothesis space and acquire their grammar more readily.
In my dissertation, I asked whether the movement principles found in natural languages affect the learning of word order in an artificial grammar, and found that the answer was yes!
How does the structure of the input influence learners' generalizations?
Children acquire a grammar that can produce an infinite number of sentences from exposure to only a finite set of sentences. Some sentences may be more informative for learners than others. For example, according to Generative Syntactic Theory (Chomsky, 1957; 1981; 1995; 2001), sentences contain words or phrases that have been moved. Some movement operations change the surface order of words or phrases in a sentence while others do not. Do learners need to be exposed to sentences that transparently reflect the order of constituents prior to movement in order to acquire syntactic structure?
In my dissertation, I manipulated learners' exposure to sentences of different types to investigate whether exposure to certain kinds of sentences is necessary for learning. Based on the kinds of sentences children hear in naturalistic learning contexts, I predicted that it should not be, and my data confirmed this prediction.
To what extent do abstract representations play a role in the acquisition of grammatical knowledge?
From the prototype literature in psychology, it appears that learners are able to extract shared properties from variable stimuli when the variation is statistically conditioned (Posner, 1964; Posner, Goldsmith, and Welton, 1967; Posner and Keele, 1968). The distributional patterns that appear across sentences of different types are the result of principled and constrained grammars. Perhaps exposure to sentences of different types allows learners to construct a prototypical sentence or structure that can be used to acquire syntactic structure.
In my dissertation, I withheld exposure to the sentence type from which all other sentences in an artificial grammar were generated. I found that learners were still able to acquire this structure. More work is needed to determine the nature of learners' representations, but this preliminary finding suggests that learners may be able to represent an abstract structure when exposed only to variations of that structure.
Kotfila, J. The Acquisition of Word Order: From Strings to Sentences. Ph.D. dissertation.
Kotfila, J., & Getz, H. Mechanisms of Learning Syntax.
de Villiers, J., Kotfila, J., & Roeper, T. (2020). When Is Recursion Easier for Children? In New Trends in Language Acquisition Within the Generative Perspective (pp. 239-256). Springer, Dordrecht.
Kotfila, J., & de Villiers, J. (2019). When Must Children Acquire Long Distance Wh-Extraction? Proceedings of the 43rd Boston University Conference on Language Development. Cascadilla Press, Somerville, MA.
de Villiers, J., Kotfila, J., & Klein, M. (2019). Parsing, pragmatics, and representation. In Three Streams of Generative Language Acquisition Research: Selected Papers from the 7th Meeting of Generative Approaches to Language Acquisition–North America, University of Illinois at Urbana-Champaign (Vol. 63, p. 85). John Benjamins Publishing Company.
Kotfila, J. A. (2018). The What and the Why of Wh-movement in the Child's Grammar: A New Perspective Through Epistemic and Deontic Modality. B.A. thesis, Smith College.