In the name of reflection, growth, and professional development, understanding how your students perceive you as their teacher and leader is important. Trying to see yourself the way that your students see you, is a humbling experience, and it forces you to reflect and be constructively critical of yourself and your practice. One way that I try to do this in my practice is by giving Student Perception Surveys.
What Are Student Perception Surveys?
When I was a graduate student, we were asked to give a student perception survey. The survey, designed by Tripod Education Partners, was intended to collect data on how students perceived their teacher, and how the teacher and class made them feel. Some example questions are:
- My classmates behave the way my teacher wants them to.
- This class does not keep my attention--I get bored.
- My teacher takes the time to summarize what we learn each day.
- In this class, we learn a lot almost every day.
- My teacher seems to know if something is bothering me.
I'm not making a pitch for this particular Tripod survey they made us give. Tripod doesn't know I'm writing this post about their survey (I hope they don't mind), and I won't post my survey or results because it's a paid, proprietary product. Maybe you can convince your admin or district to sponsor it for your school. Or you can make your own, or find a free one online. If you do, let me know, because I'd love to see what kind of items you put on your survey.
Example of How It Can Help--A Personal Case Study
Last year, I taught two sections of Integrated Math 1 for 9th graders, and two sections of a Discrete Math elective for juniors. As I was looking at the data after the first survey, I was particularly surprised by student responses to the statement: "My teacher seems to know if something is bothering me." Fewer than 50% of students agreed with this statement. I was discouraged by this response, because I try to put a lot of work and thought into making sure that my students feel emotionally attended to, and it was something I thought I did well. After seeing this data, however, I tried to incorporate more check-in questions on my Do Nows asking students how they're doing.
I also modified one of the quick check-in routines I use. At some point during class--when I'm scanning the room, or students are taking a quiz or working independently--any time I see a student who has their head down (in a "sad" way), who looks worried about something, or who's just spacing out, I try to check in with them non-verbally. I make eye contact with them, then cycle my thumb up, sideways, and down, silently asking how they're doing. This gives them a chance to simply respond with the appropriate thumbs-up, thumbs-sideways, or thumbs-down.
Before, I considered the check-in itself to be the primary means of attending to the student--letting them know that I see them, and want to know if they're okay, especially if it looks like they're struggling. In an effort to be more attentive to students who might be struggling, now if a student gives me anything but a thumbs-up in return, I'll go kneel next to them and check in privately, asking what's up and whether there's anything I can do to help.
After these two modifications, I saw a moderate improvement in teacher-favorable responses, up from 49% to 55%. This is still much lower than I was hoping for, and this area continued to be a weakness for me after the third administration of the survey. Nevertheless, in general, the survey directed me towards a blind-spot I had, which I appreciated, even as it continued to challenge me.
Things I Have Done In The Past
- Give the survey multiple times over the course of the year. I typically give the survey three times, once each in November, February, and May. This gives students some time at the beginning of the year to cultivate some feelings about me, and gives me some time after each survey to reflect and try to improve things.
- Divorce the survey from big tests/grades. It's super tempting to give the survey right after a big test, or at the end of each quarter. But that might bias the data by tying students' responses to how a recent test went, or to what letter you just sent home on a report card. Conversely, it also increases the temptation for teachers to tweak grades or routines in an effort to inflate their own survey results.
- Make it a normal day. Like I mentioned above, if you want the survey to be an accurate and honest assessment, you don't want to make a big speech beforehand, or give it the day after a field trip or on some other atypical day.
- Make sure students know the responses are anonymous. Data could be biased in either direction if students thought you could see who responded what. Also, especially if you have smaller groups of students, don't play the game where you look at the data and try to guess whose responses were whose.
- Offer the survey in native languages. My first year teaching, I had a section where almost the entire class had moved to the country in the past year. I wanted to assess their perceptions of me, not their language. So I made translations of the survey. Yes, this might have introduced new biases due to imperfect translations, but I figured it was worth it.
- Give the survey with a team of teachers. Find a team of like-minded colleagues in your building, and form a working group. You can all give the survey to your students around the same time. Then you can reflect on the data and experience together. You don't have to share all your data with each other--it is pretty sensitive and personal data. But you can if you feel comfortable--it's an intense exercise in professional vulnerability. Especially if the results hit a sensitive nerve, like mine did in the personal case study I gave above. You can use the data as a starting point for discussing problems of practice, or brainstorming ways to respond to the data. You can also help each other stay accountable to the efforts you make to respond to the data.
- Share some of the results with the students. It would probably be overwhelming to share all of the data with your students, but maybe you can share an interesting or challenging cluster of data and ask students what they think. If you're feeling really stuck on how to respond, this can help you brainstorm ideas. It also signals to students that you are actually attending to the data and care about what they have to say. If you do this, make sure it comes from a position of accepting responsibility and seeking their support and input. It must not be about blaming them or trying to justify why their responses were not favorable to you.
I am one of the advisors for my school's student government. Every year, students share that they are frustrated that they have little to no input on teacher evaluations. They are certainly the people most proximal to, and most affected by, the quality of the teacher. And they often cite the fact that student evaluations are common at the college level. I always see their point, and every year respond with how complicated and litigious the teacher evaluation process is. So from an administrative perspective, making student perception surveys a mandatory part of the teacher evaluation process is basically impossible, and definitely problematic.
Regardless, I voluntarily share my survey data, and my reflections on it, with my administrator as a part of my annual professional evaluation. It makes for a great artifact, I gotta say. But I also trust my administrator to look at the data holistically, in context, and with an eye for me trying to be reflective. I trust that my administrator will not use the data to penalize me for the areas in which I need to grow. I know that not all teachers are privileged to have this kind of evaluator. Share the data only with those with whom you feel safe doing so.
It would be inappropriate for an administrator to evaluate a teacher, or for teachers to evaluate each other, based solely on the raw data from this kind of survey. All of this data exists in a context from which it cannot and should not be extricated. As such, it would also be inappropriate to use this data to compare teachers with each other, even teachers with the same students. The degree to which an administrator attempts to do this is the degree to which they incentivize teachers to game the survey, and the degree to which they erase the value of the survey as a professional exercise.
Indeed, it is very common for teachers to avoid these kinds of surveys for evaluative purposes, because they are concerned that the surveys end up as more of a "popularity" or "likability" assessment, which is not necessarily a meaningful measure of the quality of a teacher. True, it is impossible to avoid this bias entirely. Nevertheless, I claim that students are much better at divorcing their student experience from their personal experience than we give them credit for. So we can feel comfortable giving this kind of data some real professional credence.
Regardless, and this is more of a personal opinion that not all teachers have to hold, I DO think that it is important for teachers to have a modicum of accessibility, likability, and collegiality. Despite the popular belief that students like "easy" teachers, I do think that students also find it easy to "like" a teacher who teaches them respectfully and well. Kids WANT to learn. Kids WANT to work hard. And kids appreciate teachers who help them do that.
Long story short, yes, a student perception survey will be biased in some way by whether or not students "like" you. No, that does not invalidate the professional utility of the data. Yes, it does complicate it. And if you want to invite this data into your professional evaluation, that's your professional prerogative to do so, recognizing that the evaluative focus should be on your reflection on the data, and your response to it, instead of raw performance.
If you end up putting together a working group at your school to develop or use some kind of student perception survey, let me know! I'd love to hear how it goes, what your assessment looks like, and how you or your team reflect on, and respond to the data.