NLI Shared Task 2013

The first edition of a shared task on Native Language Identification (NLI) will take place at the BEA-8 workshop. The shared task will be organized by Joel Tetreault, Aoife Cahill, and Daniel Blanchard. NLI is the task of identifying the native language (L1) of a writer based solely on a sample of their writing. The task is typically framed as a classification problem where the set of L1s is known a priori. Most work has focused on identifying the native language of writers learning English as a second language. To date this topic has motivated several ACL and EMNLP papers, as well as a master’s thesis.
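The closed-set classification framing described above can be sketched with a toy example. The miniature corpus, the character-trigram features, and the naive Bayes model below are illustrative assumptions only, not the approach of any particular shared-task system; real NLI systems use far richer feature sets.

```python
# Toy sketch of NLI as closed-set classification: given a writing sample,
# pick the most likely L1 from a known label set. All names and data here
# are hypothetical.
import math
from collections import Counter, defaultdict


def trigrams(text):
    """Character trigrams, one common surface cue in NLI work."""
    return [text[i:i + 3] for i in range(len(text) - 2)]


class NaiveBayesNLI:
    def __init__(self):
        self.counts = defaultdict(Counter)  # L1 -> trigram counts
        self.totals = Counter()             # L1 -> total trigram count
        self.docs = Counter()               # L1 -> number of documents
        self.vocab = set()

    def train(self, samples):
        """samples: iterable of (text, l1_label) pairs."""
        for text, l1 in samples:
            feats = trigrams(text)
            self.counts[l1].update(feats)
            self.totals[l1] += len(feats)
            self.docs[l1] += 1
            self.vocab.update(feats)

    def predict(self, text):
        """Return the L1 with the highest smoothed log-probability."""
        n_docs = sum(self.docs.values())
        best, best_score = None, float("-inf")
        for l1 in self.counts:
            # log prior + add-one smoothed log likelihood
            score = math.log(self.docs[l1] / n_docs)
            denom = self.totals[l1] + len(self.vocab)
            for f in trigrams(text):
                score += math.log((self.counts[l1][f] + 1) / denom)
            if score > best_score:
                best, best_score = l1, score
        return best
```

For example, after training on one short German-flavored and one French-flavored sample, the classifier assigns a new snippet to whichever L1's trigram distribution fits it better.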

Native Language Identification (NLI) can be useful for a number of applications. In educational settings, NLI can be used to provide more targeted feedback to language learners about their errors: learners with different L1s are known to make characteristically different kinds of errors. A writing tutor system that can detect the native language of the learner can tailor its feedback about an error and contrast it with common properties of the learner's L1. In addition, native language is often used as a feature in authorship profiling, which is frequently applied in forensic linguistics.

Announcements

  • April 04, 2013 Update We sent an email out to all participating teams containing the following:
    1. Significance results between best systems
    2. Updated paper submission information
    3. Poster dimensions information
    4. Registration information
    Posted Apr 5, 2013, 12:18 AM by Joel Tetreault
  • NLI Results Released! For all 29 participating teams, we sent two results emails.  The first was an automatically generated email that was sent to each team leader with statistics, results and confusion tables ...
    Posted Mar 21, 2013, 11:40 AM by Joel Tetreault
  • Test Data (Re-) Released Due to an issue with the test data, we are re-releasing a corrected test corpus.  As a result of the delay we are pushing back the deadline for system ...
    Posted Mar 11, 2013, 5:04 PM by Joel Tetreault
  • Revised NDA sent to all team leaders We have emailed a revised NDA to all registered team leaders to account for the additional data being released. In order to receive the test data on Monday, March 11 ...
    Posted Mar 7, 2013, 2:32 PM by Aoife Cahill
  • Evaluation Script Updated To prevent any potential issues with people not having their submission files sorted before passing them to evaluation.py, I have updated the script to do the sorting internally and ...
    Posted Feb 19, 2013, 8:31 AM by Daniel Blanchard
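The sorting fix described in the last announcement can be illustrated with a small sketch: predictions and gold labels must be aligned by essay ID before they are compared, which is what the updated evaluation.py now does internally. The "id,label" line format and the `score` function below are illustrative assumptions, not the official submission format or script interface.

```python
# Hypothetical sketch of sort-then-score evaluation. The line format
# ("id,label") is an assumption for illustration only.
def score(pred_lines, gold_lines):
    """Accuracy after sorting both inputs by ID, so that line order in the
    submission file does not matter."""
    def parse(lines):
        # Sort lexicographically by ID; real IDs may need a numeric sort.
        return sorted(line.strip().split(",") for line in lines)

    preds, golds = parse(pred_lines), parse(gold_lines)
    ids_p = [p[0] for p in preds]
    ids_g = [g[0] for g in golds]
    assert ids_p == ids_g, "submission and key cover different essay IDs"
    correct = sum(p[1] == g[1] for p, g in zip(preds, golds))
    return correct / len(golds)
```

With sorting done inside the scorer, a submission listed in any order scores identically to one listed in key order.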
Important Dates

Description                 Date
Training Data Release       January 14, 2013
Test Data Release           March 11, 2013
Submissions Due             March 19, 2013
Results Announced           March 26, 2013
Papers Due                  April 8, 2013
Revision Requests Sent      April 10, 2013
Camera Ready Version Due    April 12, 2013