
In teaching we regularly change our class structures and routines, and we implement new “interventions” in hopes of changing classroom dynamics or reaching more students. I know that most of the time I make these decisions based on anecdotal evidence, perhaps after glancing at a handful of exit tickets or on how I “felt” the class went. Recently, though, I find myself a little more hesitant when making a claim about my class. I require that my students support their claims with evidence, so why wouldn’t I also support mine with evidence?
Does “it felt like it went better” constitute evidence that a particular intervention was successful? Increasingly, I would argue that it doesn’t. I propose that we reframe the way we think about our science classes so that they parallel a research lab.
With our overwhelmingly busy teaching schedules, I am not suggesting that every claim be backed by a full research study (although considering methodology and forms of data collection is important). Rather, I am suggesting that we be mindful of the data we can collect and use to support claims about interventions or the efficacy of lessons. What questions are we asking? Namely, was a particular intervention successful, and how do we define “success” in this particular context? How can we look at our classes from numerous angles in order to answer these questions? This can be as straightforward as analyzing student performance on one assessment question of interest. We can then disaggregate the data based upon conditions like student attendance or homework turn-in rate, or based upon student characteristics.
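For teachers comfortable with a little scripting, the disaggregation described above can be done in a few lines. This is a minimal sketch with entirely hypothetical student records (the scores, attendance rates, and the 80% attendance cutoff are invented for illustration, not from any real class):

```python
# Disaggregate scores on one assessment question by attendance band.
# All records and the 80% cutoff below are hypothetical examples.
from collections import defaultdict
from statistics import mean

# Each record: score on the question of interest (0-1) and attendance rate (0-1)
records = [
    {"score": 0.9, "attendance": 0.95},
    {"score": 0.4, "attendance": 0.60},
    {"score": 0.7, "attendance": 0.90},
    {"score": 0.5, "attendance": 0.70},
    {"score": 0.8, "attendance": 0.85},
]

def attendance_band(rate):
    """Bucket students into coarse attendance groups (cutoff is arbitrary)."""
    return "high (>= 80%)" if rate >= 0.80 else "low (< 80%)"

# Group the scores by attendance band, then report each group's mean.
groups = defaultdict(list)
for r in records:
    groups[attendance_band(r["attendance"])].append(r["score"])

for band, scores in sorted(groups.items()):
    print(f"{band}: n={len(scores)}, mean score={mean(scores):.2f}")
```

The same pattern works for any student characteristic: swap the `attendance_band` function for whatever grouping you care about, and the loop does the rest.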
A couple of years ago I had a hunch that my female non-native English speakers were performing particularly well in my physics classes (this was especially exciting since female non-native English speakers are traditionally underrepresented in the sciences). I sought to make a more concrete claim on the basis of student growth on the pre- and post-assessments in the course. This particular group of students did in fact demonstrate learning gains greater than those of the other three subgroups I analyzed (female and male native English speakers and male non-native English speakers). [In a future blog post I’ll share the details of this study.]
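The post doesn’t specify how the learning gains were computed, but one common choice for pre/post comparisons in physics education is the normalized gain, g = (post − pre) / (100 − pre): the fraction of the possible improvement a group actually achieved. Here is a sketch with hypothetical subgroup averages (the numbers are invented, not the study’s data):

```python
# Normalized gain: what fraction of the room for improvement was realized.
# Scores are percentages; the subgroup averages below are hypothetical.
def normalized_gain(pre, post):
    """g = (post - pre) / (100 - pre); defined as 0 when pre is already 100."""
    if pre >= 100:
        return 0.0
    return (post - pre) / (100 - pre)

# Hypothetical (pre %, post %) class averages for four subgroups
subgroups = {
    "female non-native English speakers": (40, 75),
    "female native English speakers": (45, 70),
    "male non-native English speakers": (42, 68),
    "male native English speakers": (50, 72),
}

for name, (pre, post) in subgroups.items():
    print(f"{name}: g = {normalized_gain(pre, post):.2f}")
```

The advantage of a normalized gain over a raw post-minus-pre difference is that it doesn’t penalize groups that started higher and therefore had less room to grow.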
For now I’d like to emphasize the importance of being both science teachers and researchers. When we make claims about our class “going well” or a particular lesson being effective at “reaching students,” I encourage us to back up these claims with evidence. This can be so powerful when we are communicating with administrators, colleagues, parents, and students! How are you collecting evidence and supporting claims about your own instruction?
Comments (2)
True
Thanks, Shelly!
I have found myself doing more evidence-based evaluation. It can seem overwhelming, but like you mentioned...we can choose just one or two test items to focus on if that is all we have time for. A colleague of mine is using Plickers in her classroom. It is an app that you can use on your smartphone or other device. Students hold up individualized barcodes for answers to multiple choice questions, and after a quick scan of the room, she has real data to help her make informed decisions on the fly. She has clickers, but since students keep their barcodes handy every day, this saves her from having to distribute the clickers for just a few quick questions. I am going to try it for myself next year. I am always looking for easier ways to collect evidence. As you said, we are all very busy. I hope others will jump in and share how they collect data.
Deanna
We all need to think about this
Shelly, thanks for this! I'm really glad that our profession has begun to develop instruments around conceptual understanding that can help test our intuitions about how effective we are against actual data! I think that more HS chem teachers should be collecting the kind of data you describe and sharing it in both informal and formal settings...maybe you could write up the study with your ELLs in JCE, for example! We need more articles from those outside of college and university settings that evaluate our teaching!
gregor