Database in the CWC

Julie Kloss '94

The staff of the Coe College Writing Center consists of 25-30 undergraduate "writing consultants" (our term for peer tutors) who work with approximately 350 different students each year. In addition to several thousand informal contacts with students (we have a popular coffee shop, the only free laser printer on campus, and frequent informal paper-writing discussions that remain relatively unstructured), we average 1200-1300 formal conferences annually, sessions in which both the student and the consultant meet with a mutual awareness that they are having a conference. A record of each formal conference is entered onto a Consultant Conference Form, affectionately dubbed the "green sheet." These green sheets provide the raw information that feeds our database, an electronic filing cabinet that now holds records of over 6000 conferences from the past five years. Although the database was intended to make various bits of information readily accessible, it has proven to be a useful tool for more than simple record keeping. This paper will attempt to explain how the database is organized and provide some examples of its value for improving our work with students.

We maintain the conference data for several purposes. Our primary reason for the database is to improve our service to students. The database enables us to keep track of what has been covered in previous conferences and, perhaps, what effect our assistance is having. On occasion, faculty members want to know what issues were explored in a conference, and the database can supply this information quickly. It has also become apparent that the existence of the database helps to establish our credibility with some faculty and administrators, helping to convince a few doubting Thomases that there is an authenticity and justification to our work. With the touch of a few keys, we can obtain a written record of a conference or a printout describing the assistance given to a group of students from a professor's class. In several cases of suspected plagiarism, for example, it has been helpful to us, and to the student, to clarify for a faculty member exactly what happened in a sequence of conferences.

Of more frequent significance, however, is the value of the database for helping writing consultants gain a sharper impression of their efforts as peer tutors. Staff members need to take a few minutes after a conference to reconsider what happened in the session: what strategies were adopted, what worked, what didn't succeed, what was learned in the process. And since the information from the forms will be entered into the database, the transfer process enables other staff members to learn what has been happening in conferences they did not observe. The simple act of maintaining the system ensures that information about conferences is distributed across a wide net of staff members.

At some time during their tenure in the Center, staff members will be required to use the database to conduct a self-study of their work, analyzing the patterns of data recorded in their green sheets and how those patterns compare with the consulting habits of their fellow staff members. While most self-studies are private, in-house operations, these explorations of consulting patterns have also been the basis for several off-campus presentations at regional and national conferences. These formal venues encourage the staff to discuss the implications of their data with a much broader and more varied audience.

The development of our staff's self-awareness revolves around both individual consultants reflecting on their unique histories as writing center personnel and groups of consultants studying staff-wide tutoring habits. From the database research, configurations of evidence show how we operate collectively--and how individuals may tend to diverge from these group patterns. The database is an important self-reflective tool for enabling individual staff members to become aware of themselves as professionals. Completing the green sheets requires attention to the objectives of a conference and serves as an aid to reflection, forcing each consultant to reconsider the tutoring strategies adopted in a conference and how the student-writer responded to them.

Although the reasons for using the database may be compelling, the system will not work if the record keeping is not simple and practical. The system needs to produce a reasonably thorough and accurate description of our activities. We have also felt that the database should be allowed to evolve over time, addressing different issues according to the interests of the staff and our perceptions of the services we offer. For these reasons, the green sheets are revised every one or two years, in an attempt to assure maximum practicality while exploring dimensions of conferences we had not previously considered. In its latest manifestation the green sheet takes approximately five minutes to complete, recording a mixture of descriptive and analytical data. (Appendix A reproduces the form that will be used for the 1994-95 school year.) The descriptive, objective information includes the student's name, year in school, course, instructor, assignment, date of conference, and the consultant's name. The analytical, interpretive commentary is where the consultant identifies what assistance was requested, the session's primary foci, and the kinds of actions taken during the conference. After entering this information (some of which is recorded by responding to a series of multiple-choice questions), consultants provide a one-paragraph written narrative describing the most important aspects of the conference.
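
As a rough sketch of what each green sheet contributes to the database, a single conference might be modeled as a record like the one below. The field names and the Python form are purely illustrative; they are not the actual PC-File:dB layout.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class ConferenceRecord:
        """One formal conference, as transcribed from a green sheet.
        Field names are illustrative, not the actual database layout."""
        student: str
        year_in_school: str        # "Freshman", "Sophomore", "Junior", or "Senior"
        course: str
        instructor: str
        assignment: str
        conference_date: date
        consultant: str
        assistance_requested: str  # one of the eight categories listed below
        primary_foci: list = field(default_factory=list)  # at most two of the twelve foci
        actions: list = field(default_factory=list)       # actions taken during the conference
        plus_minus: str = "+"      # consultant's rating of the conference: "+" or "-"
        narrative: str = ""        # one-paragraph written summary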

Once the objective information is recorded, the assistance requested is the first interpretive section on the green sheet. This information is usually obtained by asking students what help they want, why they came to the Writing Center, or what they want the consultant to look for. The eight possibilities for this category are:

  1. None: The student gave no indication of assistance desired.
  2. Any/Everything: Student indicated that the consultant could look for any kind of problems.
  3. Brainstorm: Student wanted help with generating ideas for a paper.
  4. Discussion: Student wanted to talk, not focusing on a text.
  5. Understand Assignment: Student sought clarification on an instructor's assignment.
  6. Revision Issues: Student asked for help revising a paper.
  7. Edit/Proofread: Student asked for help with sentence-level problems (grammar, punctuation, sentence structure, mechanics of writing, etc.).
  8. Teacher Comments: Student sought help in understanding or using instructor's responses to a paper.

In the green sheet's second category, the conference foci, the consultants characterize the direction in which the conference progressed. Among the twelve possibilities, the consultants identify a maximum of two primary foci:

  1. Understand Assignment: Significant portion of discussion dealt with the student's understanding of what the instructor was asking for.
  2. Brainstorm: Discussion focused on generating ideas for starting or developing a paper.
  3. Clarify Ideas: Emphasis on helping student define, explain, elucidate, or illustrate ideas in a draft or for a draft.
  4. Structure/Organization: Major concern for how the parts of the paper were arranged.
  5. Teacher Comments: Student obtained assistance in understanding how to respond to teacher comments, directions, suggestions, or corrections on a draft.
  6. Expand/Trim Draft: Conference worked on increasing or reducing length/development of the paper.
  7. Improve Style/Sentence Structure: Time was spent with individual sentences to sharpen writing style.
  8. Proofread/Edit/Grammar: Concern for problems with individual words or phrases, including grammar errors.
  9. Library/Format: Conference dealt with library research, bibliography, and documentation procedures, or format and appearance of the paper.
  10. Reading Comprehension: Time was spent helping student understand texts being dealt with in a paper or course assignment.
  11. Discussion: Free talk, exploring issues beyond the boundaries of the student's text.
  12. Other: Focal point not covered by the first 11 categories.

The green sheet's next section attempts to help the staff identify the kinds of actions taken during the conference. Also recorded in this section is the consultant's impression of the conference as a positive or negative experience. In the 1993-94 version there were five choices for this category:

  1. Discussion/Brainstorm: Time was spent talking but few specific steps were taken in changing the text or doing any writing.
  2. Revised Draft: During the conference, specific steps were taken that resulted in changes in the draft beyond sentence or word choice modifications.
  3. Edited Draft: During the conference, specific steps were taken that concentrated on sentence-level changes.
  4. Plan for future work: Plans were made for revising or editing text but the work was left for a time outside the conference.
  5. Other: Actions that do not fit under the previous four categories.

Plus (+) or Minus (-) Conference: Consultant's evaluation of the relative success (+) or failure (-) of the conference.
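
Taken together, the three multiple-choice sections define a small controlled vocabulary. The sketch below shows one way the categories could be encoded and an entry checked as it is keyed in; the names and the checking function are hypothetical illustrations, not part of the actual system.

    # Category lists transcribed from the 1993-94 green sheet sections above.
    ASSISTANCE_REQUESTED = {
        "None", "Any/Everything", "Brainstorm", "Discussion",
        "Understand Assignment", "Revision Issues", "Edit/Proofread", "Teacher Comments",
    }
    CONFERENCE_FOCI = {
        "Understand Assignment", "Brainstorm", "Clarify Ideas", "Structure/Organization",
        "Teacher Comments", "Expand/Trim Draft", "Improve Style/Sentence Structure",
        "Proofread/Edit/Grammar", "Library/Format", "Reading Comprehension",
        "Discussion", "Other",
    }
    ACTIONS_TAKEN = {
        "Discussion/Brainstorm", "Revised Draft", "Edited Draft",
        "Plan for future work", "Other",
    }

    def check_entry(assistance, foci, actions, plus_minus):
        """Flag a green sheet entry that uses an unknown category, lists more
        than two primary foci, or lacks a +/- rating."""
        assert assistance in ASSISTANCE_REQUESTED, "unknown assistance request"
        assert len(foci) <= 2 and set(foci) <= CONFERENCE_FOCI, "at most two known foci"
        assert set(actions) <= ACTIONS_TAKEN, "unknown action category"
        assert plus_minus in {"+", "-"}, "conference must be rated + or -"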

The conference summary form concludes with a space for the consultant to write a narrative summation of, and commentary on, each conference. Consultants often record problems with the conference or the paper, clarifications of what happened in the session, and questions or concerns that arose along the way.

Once the green sheets are completed, all information is entered into the computer. For our database, we continue to use PC-File:dB, a Jim Button shareware program. Although it is a simple, "outdated" program, we have stayed with this software because it requires minimal training for data entry and manipulation. The green sheet is designed for the easy transfer of information into the database, and once there, the information can be sorted and studied from several approaches. To date, our analyses have remained relatively simple, and we have never attempted any type of sophisticated statistical inquiry. The current staff includes several peer consultants who are computer science or math majors and will be working in the writing center for the next two years. We hope they will enable us to adopt more rigorous procedures for studying our accumulated data; until now, we have been content simply to "read" the numbers and develop our analyses by relying primarily on instinct and common sense.
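
Our "reading" of the numbers usually begins with simple tallies. The sketch below shows, in generic Python rather than PC-File:dB, how conference records (such as the illustrative ConferenceRecord sketched earlier) can be counted by any one category:

    from collections import Counter

    def tally_by_category(records, field_name):
        """Tally conference records by one categorical field (e.g., the
        assistance requested) and report counts and percentages."""
        counts = Counter(getattr(r, field_name) for r in records)
        total = sum(counts.values()) or 1  # guard against an empty record list
        return {category: (n, round(100 * n / total, 1))
                for category, n in counts.items()}

    # Example: how often each kind of assistance was requested.
    # tally_by_category(records, "assistance_requested")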

Despite our limited, unscientific techniques, the data have still provided us with a rich array of possibilities for examining and reconsidering our work with students. Growth can be seen in the total number of conferences from year to year. Elementary totals can help us determine where we as a Writing Center tend to operate within the categories of assistance requested, conference focus, and conference action. During the past year, our attention has focused on two of the multiple-choice categories: assistance requested and conference foci.

Even this small number of categories can be combined to open countless possibilities for analysis. For example, one issue we studied this past year dealt with our expectations concerning upperclassmen coming to the Writing Center. We hypothesized that this group would arrive at the Writing Center with relatively precise requests for the types of assistance they were seeking, and that the resulting conferences should differ from conferences with first-year students. Older students are more likely to have visited the Writing Center previously and should be familiar with our services. They should also be more capable of understanding their own writing problems and determining what needs attention. However, when we looked at the data for upperclassmen, there was not a significant difference in the assistance requested. (See Chart A.) Regardless of the writer's year in school, a substantial plurality of students came to the Writing Center apparently not seeking specific revising or editing assistance, as evidenced by nearly 40% of students asking for "Any/Everything." This imbalance left a relatively small percentage of conferences in which the consultant felt the writer was specifically asking for assistance with Editing/Proofreading, Brainstorming, Discussion, Understanding Teacher Comments, or Understanding the Assignment.

Chart A

Requests for Assistance: Fall 1993 (535 Conferences)
Category           Freshmen   Sophomores   Juniors   Seniors   Average
None                   6%         3%          3%        8%        5%
Any/Everything        40%        33%         37%       37%       38%
Brainstorm             5%         6%          1%        5%        5%
Discussion             8%         8%          7%        8%        8%
Understand Assig       1%         1%          0%        0%        1%
Revision              24%        33%         25%       25%       26%
Edit/Proof            11%        13%         25%       10%       13%
Teacher Comm           6%         3%          3%        6%        5%

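The percentages in Chart A come from a simple cross-tabulation: within each class year, count the requests in each category and convert the counts to percentages of that year's conferences. A sketch of that computation, again with the illustrative field names used earlier, might look like this:

    from collections import Counter, defaultdict

    def crosstab_percentages(records, row_field, column_field):
        """Cross-tabulate two categorical fields (e.g., assistance requested
        by year in school) and express each column as percentages."""
        columns = defaultdict(Counter)
        for r in records:
            columns[getattr(r, column_field)][getattr(r, row_field)] += 1
        table = {}
        for col, counts in columns.items():
            total = sum(counts.values()) or 1
            table[col] = {row: round(100 * n / total) for row, n in counts.items()}
        return table

    # Example: the request categories broken down by class year, as in Chart A.
    # crosstab_percentages(records, "assistance_requested", "year_in_school")
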
As the staff discussed these numbers, several explanations were offered for why the "any/everything" request was identified so often. Some staff felt the number signified inadequate questioning techniques on the part of consultants, a failure to really find out what a student writer was seeking. The relatively low number of requests for editing and proofreading may also signify a reluctance of the staff to focus on grammar issues. The small numbers for Brainstorming, Discussion, and Understanding the Assignment probably derive from the fact that most students have produced a draft prior to the conference.

The data reported in Chart A surprised the staff because they expected that there would be some significant differences in assistance requests between the freshmen and the seniors. Coe's students average over eight writing-emphasis courses during their four years in college; the expectation was that this frequency of writing assignments would lead the upperclassmen to be more specific in their requests for conferences, particularly seeking assistance with either editing/proofreading or revision issues, such as clarification of ideas and improving a paper's organization and structure. And since our writing center employs freshmen and sophomores as peer consultants, we also expected to find significant differences between the consulting tendencies of the younger group and those of the more experienced staff. In each case, the numbers did not confirm our expectations.

With regard to the foci of conferences conducted by under-class consultants (freshmen and sophomores) versus those done by upperclassmen, we found few, if any, significant differences. (See Chart B.)

Chart B

Conference Foci Reported by Consultants: Fall 1993 (535 Conferences)
Category            8 Freshman    7 Soph.       4 Junior      7 Senior      Average
                    Consultants   Consultants   Consultants   Consultants
Understand Assig        0%            3%            1%            3%           2%
Brainstorm              6%            4%            1%            7%           6%
Clarify Ideas          17%           15%           25%           20%          20%
Structure/Org          10%           10%           21%           13%          13%
Teacher Comm            5%            6%            4%            5%           5%
Expand/Trim Draft      12%            6%           10%           10%          10%
Style/Sent Struc       11%           22%            6%           10%          12%
Proof/Editing          20%           13%           10%           11%          13%
Library/Format          1%            2%            0%            0%           1%
Reading Comp            9%            7%            6%            8%           4%
Discussion             14%            7%            6%            8%           9%
Other                   3%           10%            2%            5%           4%

It is true that first-year students working in the Writing Center tend to focus more on sentence-level issues than the upperclassmen do. Proofreading and editing was one of the two primary foci in 20% of the conferences conducted by freshman consultants; in contrast, only 11% of the conferences conducted by seniors identified it as a primary focus. But even that statistic needs to be qualified when we notice that freshmen reported discussion as a primary focus in 14% of their conferences, roughly twice the percentage reported by the sophomore, junior, and senior consultants. As the staff discussed these numbers, most felt that first-year consultants would often focus on grammar and sentence-level issues in their initial conferences, but that by the middle of their first term they were usually conducting conferences in a style quite similar to that of the upperclassmen.

While most assistance requests could be grouped into three categories (request for any/everything, revising, or editing & proofreading), the focal points of conferences appeared to be dispersed fairly evenly among four categories: Clarifying Ideas, Strengthening Structure/Organization, Improving Style/Sentence Structure, and Expanding/Trimming Draft. These numbers suggest that consultants prefer to bypass grammar issues in most conferences and to concentrate on style, organization, and development of ideas. This would be consistent with our goal of not becoming a Proofreading Center--but rather encouraging students to deal with issues beyond sentence-level errors.

As for conferences with international students (including some non-degree students in the college's ESL program), we expected these sessions to be different from the conferences we have with students for whom English is their primary language. We assumed that these papers would require more careful attention to grammar, punctuation, sentence structure, and word choice. In their requests, the ESL students asked for assistance in patterns quite similar to those reported in Chart A: Any/Everything (36%), Revision Issues (26%), and Edit/Proofread (26%). Only with editing/proofreading did we find a figure noticeably higher than that for the population as a whole (which was 15%). We were surprised that there was not a greater frequency of requests for help understanding Teacher Comments (2%) and the Assignment (1%). We weren't sure whether this indicated that faculty were being very clear with their directions and recommendations or that the international students were reluctant to ask for assistance in these areas.

Chart C provides data on the focal points of conferences with the more than 50 different international students the Writing Center worked with during a two-year period.

Chart C

Conference Foci for ESL Students, 1992 & 1993
Category             1992   1993
Understand Assign     4%     3%
Brainstorm            2%     4%
Clarify Ideas        18%    18%
Structure/Org        14%    12%
Teacher Comm          2%     1%
Expand/Trim Draft    10%    12%
Style/Sent Struc     11%    15%
Proof/Editing        20%    21%
Library/Format        1%     1%
Reading Comp         10%     2%
Discussion            3%    10%
Other                 5%     1%

The patterns of conference foci resemble the distributions reported in Chart B for all students. Perhaps the biggest surprise in these numbers was how few conferences focused on helping ESL students comprehend the texts they were using in their courses. In previous years we had always had a substantial number of conferences helping international students learn how to read their assigned texts, but in the fall of 1993 only 2% of the conferences focused on this issue.

In order to learn from our study of the database, we must take this knowledge off the paper and apply it to our ever-changing strategies for the continuing training of our staff. As our self-understanding improves, we can become a more adaptive and responsive writing center. The data will often be ambiguous: does the low number of reading-comprehension conferences mean that the international student body has changed, or that we have not adequately prepared the staff to recognize reading problems and respond to them? The numbers don't provide us with an answer. But they do help us know what questions to ask about ourselves, what training to provide for incoming writing consultants, and what expectations they might have as they begin their work with fellow writers. We think the database provides us with one additional edge for making sure we offer the best service we are capable of offering.
