Making course evaluations work

Two weeks ago, I expressed my disappointment with the course evaluation system currently found on ACES. My argument was that without qualitative data from students' written comments, the reams of quantitative data provided are all but useless to the undergraduate student body.

This week, I'd like to offer up my own vision of what a course evaluation should look like. Although I cannot flesh out every part of the argument, I'll do my best to offer as much detail as possible.

To begin with, I think ranking any aspect of a course on a 1-to-5 scale is generally a waste of time. With numerical rankings alone, a student looking at the collected data has no idea in what context to take these numbers. Could it be that the professor was really interesting but the course material painfully dry? Perhaps the material was difficult but the professor did not demand its complete mastery. Or maybe the readings were elementary but the tests, drawn entirely from lecture, impossibly difficult. Whatever the case, one cannot discern which of these possibilities is at work from numerical rankings alone.

The power of a well-presented argument cannot be ignored; a written comments section would allow students to carefully lay out their opinions of a course. Thus, even if a student were to provide numerical rankings along with the evaluation, it would be easy to put those numbers in their proper context by reading the written comments.

This brings us to the question of how to collect, transcribe and publish the evaluations. Retyping paper forms obviously wouldn't cut it, given the time and cost involved. Therefore, I propose a simpler solution: Tie course evaluations to class registration for the coming semester. A link to an online course evaluation form would appear on students' schedules a week before finals, and students would not be allowed to register for the next semester's classes until they had filled out evaluations for the courses taken that semester. A harsher but more expedient version of this idea would be to block a student's ability to check grades until the evaluations are filled out. Such an approach would not only guarantee a high response rate, it would also eliminate the problem EZDevil.com currently faces: under this system, it would be all but impossible for students who had never taken a class to fill out an evaluation for it.

At a top-notch university that consistently touts its commitment to academics, this proposal should face little opposition. Moreover, students should be especially receptive to the idea, because the only way course evaluations will ever be useful is if everyone takes them seriously.

How would this information be presented to students in the coming semester? I like the way the current system has a link below the course description--that should stay. What should change is what one sees after clicking on the link.

Although students' entries should be anonymous, the grade each student received should not be. That way, if someone who earned a C- has nothing nice to say about the course, readers can take those comments with a grain of salt. If, however, a student who earned an A thinks the course was horrible, there's probably something else at work worth noting.

The last remaining issue is the content of the written comments. How will anyone know that the students are telling the truth? The best way to deal with this is a separate screen asking students not to use profane language and to give only their honest opinions on what they witnessed in the course. Students would click "AGREE" at the bottom of the screen and proceed to fill out the evaluation.

That's it. Since we do have an honor code at this university, simply asking students to follow the above-mentioned rules should be enough--if it isn't, then why bother having an honor code in the first place? It seems to me that these are the exact types of situations the honor code was meant to cover.

Finally, for the faculty who don't think much of undergraduates, a separate page could be added for the specific purpose of allowing professors to respond to posted comments. That way, a misled student's remarks could be put in their proper context.

This is the kind of course evaluation system I wanted to see. You can well imagine my disappointment with the current system, whose only contribution to students is a lesson in how to make something appear far more useful than it really is.

Marko Djuranovic is a Trinity senior and former health & science editor of The Chronicle.
