Undergraduate researchers compare content on RateMyProfessors.com to the end-of-semester USF student evaluations.
TAMPA, Fla. (May 27, 2011) – Every semester, students across the globe log into RateMyProfessors.com (RMP) to get an idea of how challenging a class or professor is, to see whether it’s worth buying the book and to get tips and tricks from former students on how to be successful in the course.
The University of South Florida, like many universities, has its own program in which students are asked to evaluate instructors. The program, called the Student Evaluation of Instruction (SEI), includes rating categories and an area for comments.
New undergraduate research at USF has shown that there is definite convergence between the two methods of evaluation. Two undergraduate students, Adam Bergin and Austin Gower, have been working with graduate mentors and assistant professor Santiago Sandi-Urena to research both qualitative and quantitative aspects of RMP and the SEI to see if there is any validity to the comments students post on RMP.
The students looked at about 60 comments on RMP from the General Chemistry I and II and Intro to Chemistry classes at USF.
“We came up with a tool to evaluate instruction, specifically introductory chemistry, using RateMyProfessors and the Student Evaluations of Instruction,” said Bergin, a biomedical sciences student. “We take student comments about the instruction in the specific classes and we use what students find as important in regard to evaluating classroom instruction. Then we came up with a tool that categorizes the qualitative comments.”
Bergin said that a lot of people disregard RMP as a tool for legitimate evaluation of instruction.
“A lot of people come in with opinions and think something’s not valid or something doesn’t really work,” Bergin said. “There are a lot of assumptions about Rate My Professors, and from previous research and our own, a lot of them aren’t true.”
The team wants to make one thing clear, though: they’re not trying to validate or invalidate the website. They’re analyzing the site’s content to see what students consider to be important aspects of professors and classrooms.
“We wanted to look at the content on the site, that rich information, to understand what’s going on in the classrooms to improve our teaching,” Sandi-Urena said. “We want to validate not the site but the information given by the students at our institution.”
Sandi-Urena said there is considerable controversy nationwide about student evaluations, and many people do not believe in generic evaluation forms. A rating considered good in one department might be considered bad in another, he said, so departments end up assessing different things even though students complete the same form.
The team also discussed summative evaluations, which are completed at the end of the term, versus sites like RMP that allow students to log in and share information with their peers as early as the first class and throughout the semester.
“Students are the most important participants in this process of learning and teaching,” Sandi-Urena said. “But students aren’t invested in evaluations because it’s the end of the semester and their ratings and comments won’t change their experience.”
Professors who look at their own evaluations throughout the semester could get an idea of what’s working for their students and what they could improve on. Sandi-Urena said some professors informally ask students for feedback throughout the semester, but the practice is not institutionalized.
Gower said that the research team is developing a framework for better evaluations in the future not only for the chemistry department’s introductory classes, but on a university-wide scale. Sandi-Urena said that, ideally, evaluations would be specific to departments and classes.
“It seems we should update the Student Evaluation of Instruction to make it relevant to the students and effectively appropriate,” Sandi-Urena said.
What’s meant by effectively appropriate? Students should consider things like subject matter when writing their reviews. A student might post “This class is really hard but it’s Organic Chemistry so it’s to be expected.”
Students are also writing to their peers on RMP, not to the professors or university officials who view the SEIs. Some USF students question whether, given the large student population, the SEI evaluations are ever read or considered.
RMP is frequently regarded as a site where students go to rant and rave about professors, Sandi-Urena said. The site even addresses that in its Frequently Asked Questions section, saying that “you might be surprised to learn over 65% of the ratings are positive.”
What the team found was that students who contribute to RMP are no different from students who don’t. This means the site isn’t attracting only students who want to disparage professors, but a variety of people, most of whom want to offer honest feedback and advice to their peers.
Seven categories emerged from the RMP comments: nature of the course, nature of the subject, student responsibility, quality of the instructor, instruction, suggestions to other students and instructor traits. Sandi-Urena said that the SEIs currently ignore much of what is important to students.
On the other hand, there is convergence between the two methods of evaluation. Professors who consistently received a high number of positive evaluations on the SEIs also received a high number of positive ratings on RMP, and professors who scored lower on one also scored lower on the other.
Gower, a public health student, said their research is “to help students better look at the professors and for professors themselves to look back at what they did wrong and could improve on. It is in no way set up to judge the professors so that schools would fire them.”
The team is working toward publication in the Journal of Chemical Education and is further investigating who contributes to RMP in order to establish comparisons among RMP contributors, RMP users and first-year chemistry students.
Daylina Miller can be reached at 813-500-8754.