Do We Know What History Students Learn? (Wineburg, Breakstone, and Smith)

“Sam Wineburg is the Margaret Jacks Professor of Education and of history (by courtesy) at Stanford University. Joel Breakstone is the executive director and Mark Smith is director of assessment at the Stanford History Education Group.”

This article appeared in Inside Higher Ed, April 3, 2018

“What are you going to do with that — teach?” Uttered with disdain, it’s a question history majors have been asked many times. Clio’s defenders have a response. The head of the American Historical Association says that the study of history creates critical thinkers who can “sift through substantial amounts of information, organize it, and make sense of it.” A university president asserts that the liberal arts endow students with the “features of the enlightened citizen” who possesses “informed convictions … and the capacity for courageous debate on the real issues.” Historians pride themselves on the evidence for their claims.

So, what’s the evidence?

Not much, actually. Historians aren’t great at tracking what students learn. Sometimes they even resent being asked. Recently, however, the winner of the Bancroft Prize, one of history’s most distinguished awards, washed the profession’s dirty laundry in public. The article’s title: “Five Reasons History Professors Suck at Assessment.”

Anne Hyde described what happened when accreditors asked her colleagues to document what students learned. They paid little heed to the requests — that is, until Colorado College’s history department flunked its review. Committed teachers all, her colleagues “had never conducted assessment in any conscious way beyond reporting departmental enrollment numbers and student grade point averages.”

Among many college history departments, this is routine. To address the issue of assessment, the American Historical Association in 2011 set out on a multiyear initiative to define what students should “be able to do at the end of the major.” Years of work, dozens of meetings and hundreds of disposable cups later, the Tuning Project produced a set of ambitious targets for student learning. But when it came to assessing these goals, it left a big question mark.

That gap is one of the reasons we were convinced of the need to create new assessments. With support from the Library of Congress, we came up with short tasks in which history students interpreted sources from the library’s collection and wrote a few sentences justifying their response. For example, one assessment, “The First Thanksgiving,” presented students with a painting from the beginning of the 20th century and asked if the image of lace-aproned Pilgrim women serving turkey to bare-chested Indians would help historians reconstruct what may have transpired in 1621 at the supposed feast between the Wampanoag and English settlers.

[Image: “The First Thanksgiving,” early 20th-century painting]

In the March issue of the Journal of American History, we describe what happened when we gave our assessments to students at two large state universities. On one campus, we quizzed mostly first-year students satisfying a distribution requirement. All but two of 57 ignored the 300-year time gap between the Thanksgiving painting and the event it depicts. Instead, they judged the painting on whether it matched their preconceptions, or simply took its contents at face value — an answer we dubbed the “picture’s worth a thousand words” response.

We weren’t terribly surprised. When we tested high school students on these tasks, they struggled, too, and many of these college students were in high school only months earlier. But what would happen, we wondered, if we gave our tasks to college juniors and seniors, the majority of whom were history majors and all of whom had taken five or more history courses? Would seasoned college students breeze through tasks originally designed for high school?

What we found shocked us. Only two in 49 juniors and seniors explained why it might be a problem to use a 20th-century painting to understand an event from the 17th century. Another one of our assessments presented students with excerpts from a soldier’s testimony before the 1902 Senate Committee investigating the war in the Philippines. We asked how the source provided evidence that “many Americans objected to the war.” Rather than considering what might prompt a congressional hearing, students mostly focused on the document’s content at the expense of its context. Rare were responses — only 7 percent — that tied the testimony to the circumstances of its delivery. As one student explained, “If there hadn’t been such a huge opposition by Americans to this war, I don’t believe that the investigation would have occurred.”

We suffer no illusions that our short exercises exhaust the range of critical thinking in history. What they do is provide a check on stirring pronouncements about the promised benefits of historical study. In an age of declining enrollments in history classes, soaring college debt and increased questions about what’s actually learned in college, feel-good bromides about critical thinking and enlightened citizenship won’t cut it. Historians offer evidence when they make claims about the past. Why should it be different when they make claims about what’s learned in their classrooms?


6 Comments

Filed under higher education, how teachers teach


  1. Laura H. Chapman

“Instead, they judged the painting on whether it matched their preconceptions, or simply took its contents at face value — an answer we dubbed the ‘picture’s worth a thousand words’ response.”

In this case, the students were also uninformed about strategies for looking at artworks and thinking about other questions, such as for whom the images were made, when, and more generally the rudiments of art criticism. If you go to Google image search, many depictions of the “first Thanksgiving” can be found. I am thinking that some exercises in comparing and contrasting these images might tune up the eye and mind for the kind of historical understanding hoped for. Students still seem to think, or are taught, that there is one and only one grand historical narrative (Euro-centric).

  2. David F

Hi Larry—there’s a problem with this: it assumes students develop generic skills within a discipline, a premise that has been challenged. Michael Fordham in the UK has written on this pretty extensively. See here for starters: https://clioetcetera.com/2017/03/19/the-problem-with-general-ability-statements-in-history-education/

    • larrycuban

      Thanks for the comment, David, and the link to Fordham’s piece.

    • Chester Draws

      David. That’s precisely the point. The evidence for generic skills is low.

That means History doesn’t really teach “thinking.” In which case, why are so many people studying History?

      We don’t need that many Historians. They aren’t learning anything much useful. The money and time they spend is being wasted.

      I love history. I have bookshelves full of history books. But studying History is a waste of time.
