The Twelve Days of Performance Review

It’s that time of year—the time when faculty members at Waterloo start thinking about writing their annual performance review (APR) documents and putting together their files. In this spirit, the FAUW Equity Committee offers twelve tips to help you think about equity as an essential part of this process.


On the first day of performance review season, collaborate with members of your own department to demystify the review process, especially for new faculty members. All APRs are local: how things are done in another department is probably not how they are done in yours. Consider starting a sharing circle: pool APR reports, with or without the numbers attached, so that you can get a feel for the genre. Pay it forward. Mentor those junior to you.

On the second day of performance review season, focus on your teaching effectiveness in the full knowledge that student questionnaires correlate principally with non-instructional factors (scheduling, student interest in the topic, grade expectations, and the like).

The Ontario Confederation of University Faculty Associations (OCUFA) states unequivocally:

“Using SQCTs [student questionnaires on courses and teaching] for performance evaluation penalizes women, racialized and LGBTQ2S+ faculty, and faculty with disabilities. These faculty are also more likely to be the target of harassment in the anonymous comments sections of the questionnaires. Further, using SQCTs for performance evaluation risks undermining effective teaching and intellectual diversity.”

To cite this report in your own performance review documentation:
Ontario Confederation of University Faculty Associations. “Report of the OCUFA Student Questionnaires on Courses and Teaching Working Group.” February 2019. https://ocufa.on.ca/assets/OCUFA-SQCT-Report.pdf

On the third day of performance review season, start a broader discussion at the department level. Every department has guiding documents that outline how to evaluate performance. Share the recent arbitration ruling at Ryerson University (Ryerson University v. The Ryerson Faculty Association, 2018), which found that student course questionnaires are valuable instruments for “captur[ing] student experience” but cannot be “used to measure teaching effectiveness for promotion or tenure.”

Or talk about how the Department of Psychology at Waterloo decided not to use student evaluations of teaching in their review process, citing the bias inherent in these evaluations. This effort was rejected at the decanal level, but maybe we just need more departments to take a principled stand. Consider citing the OCUFA document (see: day two) in your department’s review documentation. Ask what other sources of bias might exist in your department’s process.

On the fourth day of performance review season, your department gets a special gift: a junior faculty member on the performance review committee! Rotating junior members of the department onto the committee matters both because it pulls back the curtain for those colleagues and because it can be unfair for junior faculty members to be evaluated only by senior colleagues. The Memorandum of Agreement (MoA) between FAUW and the University stipulates that five members must assist the Chair on this committee (see section 13.5.6), but it does not specify their rank or other details.

On the fifth day of performance review season, take a moment to think about how you may benefit from inherent biases in the evaluation system. This could be because of your race, gender, or some other aspect of your lived experience. If you believe that teaching evaluation instruments or other parts of the review process are biased, say so. Your voice really matters. You might even consider including a statement in your own review, acknowledging that you might benefit from this bias. The Department of English, for example, allows members to include a footnote in which they acknowledge the bias inherent in student questionnaires.

Also consider sharing your summaries and other materials with members of your department. Finally, is there gender balance and diversity on the committees in your department? If not, you can advocate for this.

On the sixth day of performance review season, why not have a “transparency meeting” in your department that discloses preferences and assumptions among faculty? For example, Communication Arts holds a “tacit knowledge meeting” to get everyone on the same page about how things work. In one such meeting, senior faculty noted that some people list department meetings as part of their service, but that these carry little weight in determining service scores. They recommended leaving them out, since listing them can look like padding the service area. They also noted that this might be something junior faculty tend to do, especially early in their Waterloo careers.

The department also workshopped a fake performance review document and discovered how and why each person would rank the sample the way they did. At the transparency meeting, make sure that everyone, including the committee members, really understands the Faculty guidelines and departmental addendum or other instructions provided to faculty. It can be frustrating to follow the directions and then find out the committee didn’t understand the process!

On the seventh day of performance review season, take a break.

On the eighth day of performance review season, imagine if there were no numerical scores in this process at all! Imagine that! A few years ago, Renison moved from a system that rated faculty on a five-point scale (from unsatisfactory to outstanding) to a new system that simply distinguished between “unsatisfactory” and “satisfactory” performance. Renison faculty came from so many different disciplines that it was difficult to compare research contributions, and there was concern that cross-disciplinary misunderstandings could exacerbate existing bias. Now, Renison faculty receive narrative evaluations along with a simple satisfactory or unsatisfactory rating.

On the ninth day of performance review season, consider the tone of your report: write it as if you were writing a letter in support of your strongest student. Be generous and specific in describing your own activities and why they matter. No one is going to fill in the adverbs or adjectives you think you’re implying, or go out of their way to understand the impact of your work if you don’t show them. Committees are harried and they are (honestly) looking to keep the average scores average. Don’t make it easy to mark you down.

On the tenth day of performance review season, volunteer for the assessment committee in your department. Seeing how the assessment works from the inside really clarifies things. Whether the committee already has gender balance, diversity, and a spread of ranks, or whether you can add to these things: volunteer!

On the eleventh day of performance review season, make sure that your kind of excellence can be accounted for in department or unit practice. If you do a lot of policy work, where does that fit and how does it count? Do a lot of ‘public intellectual’ work? Make a case for where it belongs on the report and why it should be part of your assessment.

On the twelfth and final day of performance review season, consider writing down a list of your accomplishments in 2019—the ones that have nothing to do with service, teaching, research, or your job at all. This list could include the things you are glad you said no to. This list is probably much more important, and much more fun to create, than your annual performance review.
