Confessions of a Luddite professor (Daniel Drezner)

Daniel W. Drezner is a professor of international politics at the Fletcher School of Law and Diplomacy at Tufts University. This post appeared in the Washington Post on April 28, 2016.


I had the good fortune on Wednesday to hear economist Robert Gordon talk about his magnum opus, “The Rise and Fall of American Economic Growth.” Gordon has a somber tale to tell. He argues that U.S. economic growth ain’t what it used to be, and that ain’t gonna change over the next 25 years. This is due to myriad headwinds such as demographic slowdowns, rising inequality, fiscal constraints, and — most important — the failure of newer technologies to jumpstart economic growth the way that the Second Industrial Revolution did.

It’s his last point — about the effect of information technology on productivity — that prompts so much fierce debate.  Economists are furiously debating whether the visible innovations in the information sector are leading to productivity advances that are going undetected in the current productivity statistics. On the one hand, the aggregate data suggests a serious productivity slowdown over the past decade. On the other hand, Google’s chief economist, Hal Varian, insists that “there is a lack of appreciation for what’s happening in Silicon Valley, because we don’t have a good way to measure it.”

Surely, there are sectors, such as higher education, in which technological innovations can yield significant productivity gains, right? All that talk about MOOCs and flipped classrooms and the like will make a difference in productivity, yes?

As an optimist, I’ve long resisted Gordon’s argument — but this is one area where I’m beginning to suspect that he’s right and Silicon Valley is wrong.

I’ve been teaching for close to 20 years now. During that time, the IT revolution has fundamentally transformed what I do on a day-to-day basis. It is massively easier for me to access data that helps inform my classes. It is also much easier to show video or play audio for my students. I’ve Skyped in as a guest lecturer for numerous colleagues. Course websites have made it far easier for me to communicate with my students, and for them to communicate with me. There is no denying that on some dimensions, technological change has made it much easier for me to do my day job.

And yet, over the past decade, I have also gone in a more Luddite direction. After having a laissez-faire policy on laptops in my classrooms for my first decade of teaching, I have pretty much banned them. I knew that taking notes by hand is much, much better for learning than taking notes on a computer (the latter allows the student to transcribe without thinking; the former forces the student to cognitively process what is worthy of note-taking and what is not), but I figured that was the student’s choice. The tipping point for me was research showing that open screens in a classroom distract students close to the screen. So I went all paternalistic and decided to eliminate them from my classroom. The effect was immediate — my students were more engaged with the material.

My lectures are pretty low-tech as well. I use videos in class on occasion, but I usually show them at the beginning and then start lecturing. Otherwise the lights have to be dimmed, and that’s an invitation for the student to tune out. Similarly, I don’t use PowerPoint for my notes, because that just invites the student to transcribe the points on the slide without thinking about them.

One could argue that Skyping in as a guest lecturer, or just broadcasting a superstar professor into other universities, could improve the quality of the classroom experience. But I doubt it. Speaking from experience, lecturing remotely is a radically imperfect substitute for interacting in the same physical space. A mediocre but in-the-flesh professor still provides a better educational environment than a remote lecturer whom one watches on a screen.

There has been one innovation over the past generation that has made my in-class teaching better: the whiteboard is far better to use than the blackboard. Otherwise, I have become warier of new technologies in the classroom.

Maybe this is just me being a Luddite, and, as digital natives, millennial professors will figure out how to properly exploit information technologies in the classroom. And outside of the classroom, I’m a pretty big fan of these new technologies.

But when it comes to higher education, I think Gordon is right and Varian is wrong. There are gains to be wrung from technological innovation — but they’re much more limited than Silicon Valley wants you to believe.


Filed under how teachers teach, technology use

9 responses to “Confessions of a Luddite professor (Daniel Drezner)”

  1. Andy Felts

    Bravo!

    If you ask students to be honest about their behavior with their laptops, they will tell you that they find the temptation to check email or send an IM (via their handheld) irresistible. Though we can switch between tasks very quickly, we lose some time to refocus our thoughts each time we do; we cannot truly multi-task, and yet students think they can. I ban laptops and handhelds in my (college) classrooms, and I cannot tell you how many times I have paused the entire class as I stare at a student who is staring at a handheld discreetly held in their lap. It sometimes takes what seems an agonizingly long time for them to realize I am staring at them. I then ask them to tell me what I was saying when I paused to wait on them, and of course they cannot answer.

    From what we know of mirror neurons, when one student does something there is an empathetic response in those who observe it: they ‘see’ themselves looking at their own handhelds.

    If the whiteboard Professor Drezner is referring to is a regular dry-erase board, then I agree. Otherwise, the ‘use a special pen’ board (at least at my institution) is deficient. There is a noticeable (and annoying) lag between what I write and its appearance on the board, and I am confined: I cannot draw a long line from one side of the room to the other to link concepts that I intend to explain by filling in the gap between them with additional concepts. The board sticks out, preventing students in front on each side from seeing the other side clearly.

    Our classrooms have become techno-centric rather than teaching-centric. The screen is in the middle and the professor is relegated to a large console in the corner. Students are oriented to be screen watchers, so that is where their eyes go. My institution’s IT folks actually have a contest going in which we are invited to “shout out” colleagues who are using innovative teaching techniques, never mind whether the students are learning anything. Innovation (called “high-impact learning” here) is de facto good just because it more than likely employs technology.

    Finally, like Professor Drezner I do show short movies or clips in class, but I also do so at the beginning, regardless of where they might appropriately fit in the flow of teaching. We all experience glitches from time to time when we try to shift from teaching to a video, and even when we don’t, the time required to set it all up is mechanically disruptive to what I consider an organic flow.

    AAF

  2. I think this is a thoughtful approach that seeks to help students succeed — which is what all educators should be focused on, even when doing so challenges our innate beliefs. Whether we’re talking about new teaching methods or new teaching tools, we should always question and compare. We must also be willing to act on evidence, and that means overcoming our own biases (whether for or against). I always go back to Clark’s media debate, which (whether you agree with Clark or not) was a brilliant exercise in questioning the value of incorporating new technology by focusing on the most feasible benefit: cost/time savings. Technology tends to be really good at that.

    • larrycuban

      Thanks for the comment, Jared. Like you, I find that Richard Clark’s argument continues to resonate with me.

  3. Trying to fit 1,000-year-old teaching methods (lecture, sage-on-the-stage, and so on) to 21st-century tech is going to be a problem. Not that lecture, as old as the method is, isn’t an excellent teaching strategy, but modern tech opens up new teaching methods. As a 30-year teacher, I still think lecture is one of the best methods of imparting knowledge to a large group in one location. As long as we continue to have one teacher trying to give their knowledge to a large group in a classroom situation, tech is going to be a hindrance most of the time. In a situation where the teacher is directing learning (usually small groups), tech becomes useful.

    PowerPoint is a great tool if used properly: one or two 2-3 word bullets per slide, where the bullets are topic titles, not content. It is great for pictures, maps, and discussion points. I teach a high school senior stats class. The textbook came with some very comprehensive PowerPoint slides for each chapter. For the reasons stated by the author, I have stopped using them except for the pictures of graphs. Much better than my artistic renderings.

    We have a teacher in the middle school who uses a Smart Board almost exclusively to teach math. What he does is great to watch. It has taken him about 3 years to refine the method to the point where it is better than what could be done on a regular board.

    Tech in the classroom, whether in the hands of the students or the teacher, is very situationally dependent. When it is wrong, it should not be made to fit. The local public high schools have put interactive boards in all the classrooms; 90% go unused. Round peg, square hole.

  4. Pingback: Gordon is right: |

  5. Pingback: Gordon (not to be confused with) is right: |
