Listen to Danielle Ofri lamenting a fact she discovered about her work as a physician at New York City’s Bellevue Hospital: “we often ignore good advice when it conflicts with what we’ve always done.”
Ofri was referring to the latest clinical guideline issued by the Society of General Internal Medicine that recommended against annual physical exams for healthy adults. The scientific evidence shows “the harm of annual visits — overdiagnosis, overtreatment, excess costs — can outweigh the benefits.” These guidelines become “best practices” for physicians to follow with patients; they are based upon analysis of many clinical studies.
Keep in mind that the body of evidence producing clinical guidelines for how doctors should practice is based on cumulative research and meta-analyses often involving tens of thousands of patients in control and experimental groups. “Evidence-based medicine”–even with all of the criticism of reversals in the advice doctors receive–is a reality at the fingertips of every doctor tapping keys and watching the computer screen as they take a patient’s history and conduct an annual exam.
Yet Ofri continues to have her patients return every year for an annual exam. How come?
She says: “After the research was initially published last year, I grappled with the evidence, or lack thereof, reaching a conclusion that I mainly still supported the annual visit, if only because it establishes a solid doctor-patient relationship. But seeing these new, strongly worded recommendations, I may have to re-evaluate. At the very least, I should take a moment to think before I reflexively recommend the annual visit. But I know that I might still end up doing the same thing, despite the evidence.”
She concludes: “Humans are creatures of habit. Our default is to continue on the path we’ve always trod.”
For some physicians, habit trumps evidence, or what was once a “good” habit–annual exams for all of their patients–becomes a “bad” habit. The same is true for K-12 teachers.
No such clinical research base, however, exists for recommending “best practices” in teaching reading, math, science, or history. Sure, there are single studies, even groups of studies, that point in a direction teachers might consider in teaching long division or teaching six-year-olds how to parse vowels and consonants. But for most teachers, “best practices” are a meld of what researchers say practitioners ought to do, what “experts” say should be done in classrooms, lessons learned from personal experiences in teaching, deeply ingrained beliefs–call it ideology–about how best to teach and how students learn, and, yes, you guessed it: habit.
All of these ways of defining “best practice” for teachers came into play when I taught history to high school students many years ago. Let me explain.
In the fifth year of my teaching at Cleveland’s Glenville High School–it was the early 1960s–I had already introduced materials to my classes on what was then called “Negro history” (see here and here). I then began experimenting with the direct teaching of critical thinking skills. I believed that such skills were crucial in negotiating one’s way through life and understanding history. I wanted my students to acquire and use these skills every day. So I began teaching my U.S. history courses with a two-week unit on thinking skills. My theory was that students learning these skills at the very beginning of the semester would then apply them when I began teaching units on the American Revolution, Immigration, Sectionalism and the Civil War, and the Industrial Revolution.
In the two-week unit, I selected skills I believed were important for understanding the past, such as: figuring out the difference between fact and opinion, making a hunch about what happened and sorting evidence that would support or contradict the hunch, judging how reliable a source of information is, and distinguishing between relevant and irrelevant information in reaching a conclusion.
For each of these, I would go over the specific skill with the class and they and I would give examples from our daily lives, school events, and family happenings. Then, I chose a contemporary event–a criminal case in the local newspaper, a national scandal that was on television, and occurrences in the school–and wrote out a one-page story that would require each student to apply the particular skill we were discussing such as making an informed guess, collecting evidence to support their hunch, and reaching a judgment. I also gave the class additional sources from which they could (or could not because of biases) select information to support their conclusion.
For the two weeks, each period–I was teaching five classes a day at the time–was filled with engaged students participating in flurries of discussion, debates over evidence, students questioning each other’s conclusions, and similar excitement. I was elated by the apparent success of my critical thinking skills unit.
After the two weeks of direct instruction in skills, I plunged into the coming of the American Revolution and subsequent history material. From time to time, over the course of the semester, I would ask questions that I felt would prompt use of those thinking skills we had worked on earlier in the year. Blank stares from most students, with an occasional “Oh yeah” from others. I designed homework that explicitly called for use of these thinking skills; few students applied what they had presumably learned. I was thoroughly puzzled.
Which brings me to the concept of transfer. Why did students taught discrete thinking skills directly with a high degree of engagement and apparent learning for two weeks have a difficult time transferring those very same skills to history lessons later in the semester? I take up this issue and my “bad” habit in the next post.