Tracking Student Needs with ChatGPT
Will ChatGPT help me evaluate and provide recommendations for future student development?
By Lowell Sanborn
April 21, 2023
ChatGPT as a tool for language learning is intriguing, and as an English language educator I find the prospect compelling. A quick conversation with the new generative AI gives educators a list of its possible uses.
As an AI language model, I can help English language teachers in a variety of ways. Here are some examples:
1. Language practice and correction: I can provide language practice exercises, including grammar and vocabulary exercises, for students to practice their English skills. I can also correct their language errors and provide feedback.
2. Lesson planning: I can assist with lesson planning by providing resources and ideas for teaching topics, activities, and assessments.
3. Pronunciation and accent training: I can provide exercises and resources for improving pronunciation and reducing accents for students who are non-native speakers of English.
4. Language assessment: I can assist in evaluating students' language proficiency levels and progress, providing recommendations for further learning, and assessing students' writing and speaking skills.
5. Resource recommendations: I can suggest relevant reading materials, videos, and podcasts for students to improve their English comprehension and vocabulary.
6. Translation: I can translate text and provide language assistance for non-native speakers of English.
Overall, I can be a valuable resource for English language teachers, providing personalized support and assistance to help students improve their language skills.
Of particular interest is suggestion 4, "Language assessment," which states that ChatGPT can help in "evaluating students' language proficiency levels and progress, providing recommendations for further learning, and assessing students' writing and speaking skills." Identifying and providing future activities is something our courses could develop further. Let's see how this goes.
I typed in "can you evaluate my student's writing, determine their level using CEFR framework, and then suggest future activities to develop my student's English skills?" After the message affirming it could in fact help me out, I copied some of my student's work into the chat box. The work came from an old writing class, and the student had received a perfect score based on my grading criteria; I thought the writing did enough to make its point in an opinion essay that required citations. The main objective of the assignment was to use sources to help justify an opinion. With the pertinence of the sourced material being the main objective, the student may have wished to receive more linguistically focused feedback. This could be a useful place for some of what ChatGPT is suggesting.
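(For anyone who would rather script this step than paste into a chat window, here is a rough sketch of what the same request might look like through OpenAI's Python library. I used the ordinary chat interface for everything in this post, so the model name and the code below are illustrative assumptions, not a record of what I actually ran.)

```python
# A rough sketch, not part of my actual workflow: sending the same
# CEFR-evaluation request through OpenAI's Python library instead of
# the chat window. The "gpt-3.5-turbo" model name is an assumption.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

student_writing = "..."  # paste the student's essay text here

prompt = (
    "Can you evaluate my student's writing, determine their level using "
    "the CEFR framework, and then suggest future activities to develop "
    "my student's English skills?\n\n" + student_writing
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

# The evaluation and the suggested activities come back as plain text.
print(response["choices"][0]["message"]["content"])
```

Either way, the request is the same thing: the instruction plus the pasted student text. One detail worth remembering for later is that the model only acts on what it is sent, so the instruction has to travel with each new piece of writing.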
It seems that ChatGPT similarly found the point to be effectively made. Not that I needed the pat on the back, but hey, I'll take it. After pasting in my student's writing, I was happy to see the following response.
Based on the sample of writing you provided, I would assess your student's writing proficiency level using the CEFR framework as B2 (upper-intermediate).
In terms of grammar, your student demonstrated a good command of complex sentence structures and used appropriate verb tenses throughout the text. There were a few minor errors in subject-verb agreement and preposition use, but overall, the grammar was solid.
In terms of vocabulary, your student used a variety of words and phrases to express their ideas clearly. They also used appropriate collocations and idiomatic expressions throughout the text.
In terms of coherence, your student organized their ideas effectively, using clear topic sentences and supporting details to develop their arguments. The text flowed smoothly from one idea to the next and showed a good understanding of paragraph structure.
Based on this evaluation, I would suggest that your student continue to focus on developing their writing skills in English. Some activities that could help them improve their writing skills at the B2 level might include:
Reading and analyzing a variety of texts, including news articles, opinion pieces, and academic essays, to develop critical thinking skills and expand vocabulary.
Practicing writing in different genres, such as argumentative essays, research papers, and creative writing pieces, to develop versatility and adaptability in writing.
Participating in peer review and revision exercises to receive feedback on their writing and improve their editing skills.
Engaging in online writing communities or attending writing workshops to receive guidance and feedback from professional writers and other learners.
Continuing to read a variety of literature, including fantasy and science-fiction novels, to develop critical thinking, creativity, and emotional intelligence.
Overall, your student has demonstrated a strong foundation in writing in English, and with continued practice and exposure to different types of texts and writing genres, they can continue to develop their writing proficiency and critical thinking skills.
Besides the pat on the back, I was underwhelmed with the suggestions the program offered. They refer to things learners at any level could do to improve their writing. That said, a teacher agonizing over which direction to pursue with a more advanced writer may only need a nudge in the right direction, and this could be that nudge. If you were looking for confirmation of good writing, it may not be necessary. If you wanted help classifying your student's language level and were unfamiliar with the Common European Framework of Reference for Languages (CEFR) benchmarks, it gives you some idea of the nearest benchmark, though it does not specify what was evaluated or where precisely the AI sourced its proficiency descriptors. Maybe that helps a student or teacher identify something of value.
Let's try a student who might need more detailed instruction to write more clearly. I went back to the well of old student writings from the same assignment and found a student who scored poorly but whose piece fell within the suggested 150-200 words for the bot to analyze.
At first I simply pasted in the text without realizing I needed to prompt the AI again, and it summarized the short piece of writing… at around the same number of words. So I asked it to evaluate the level of the work using the CEFR and to suggest some further activities for improvement. Here's what it said.
Based on the CEFR framework, I would evaluate your writing in the second text as B1 level. You use mostly simple and compound sentence structures, and your vocabulary and grammar are generally accurate, but with occasional errors. Your writing is also clear and easy to understand, with coherent organization and development of ideas.
To further develop your English skills, I would recommend focusing on improving your grammar and vocabulary, particularly in the areas where you tend to make errors. It would also be helpful to practice writing more complex sentence structures and using more varied vocabulary. Additionally, you could try reading and analyzing more complex texts to improve your critical thinking and writing skills.
Again, the proficiency level could be of some use, though it's not clear that students themselves would be well equipped to benefit from knowing it. Teachers may be able to use it to find new resources online (like here, from Cambridge) or to develop new materials based on level-specific criteria.
Otherwise, the suggestions are likewise crude and not necessarily something I see myself needing. Telling a less proficient learner to improve their grammar is something of a cop-out, and referring to "the areas where you tend to make errors" does not seem very helpful.
ChatGPT seems to have some use in determining student ability levels, but does not specifically identify the issues in student writing. In the first case, subject-verb agreement was isolated as one area to improve. Let’s see if ChatGPT can identify the areas that caused it to determine B1 for the second author.
Returning to the chat, I asked ChatGPT to tell me why the second author received a lower proficiency level.
Based on the writing sample you provided, I evaluated it as CEFR level B1 because the writer is able to communicate their ideas and opinions with a certain degree of coherence and clarity, using a mix of simple and more complex sentence structures. The writer also demonstrates some ability to organize their ideas, although there are some instances where the organization could be improved for greater clarity.
At the B1 level, writers are expected to be able to express personal opinions, describe experiences, and give reasons and explanations for their views. The writer in this sample does all of these things, offering opinions on the relationship between money and happiness, the problem of unemployment, and the challenge of finding a job related to one's major. They also support their opinions with specific examples and statistics, indicating a level of detail and accuracy that is consistent with the B1 level.
Overall, while there are some areas where the writing could be improved, such as greater clarity and organization, the writer demonstrates a solid command of basic grammatical structures and vocabulary, as well as the ability to convey their ideas in writing.
The clarity of the explanation was cited as the reason for the evaluation, and the descriptors do trace back to the CEFR level, so that part checks out and can be helpful. What I was really looking for, though, is what the response avoided: the grammatical errors themselves and the specific instances worth noting. This could use some more exploration, but for now I think I will continue to evaluate student writing without generative AI assistance, though more targeted requests might probe the writing more thoroughly. It's not like the AI knows I am not yet impressed by its ability to help me teach… I hope.