OECD Report: Puzzles To Solve (Part 2)

In this post, I will sketch out two puzzles that emerge from the OECD report, “Students, Computers, and Learning.” The first arises from the gap between high PISA test scores and low use of computers in school in particular countries. The second puzzle is trying to explain the inattention that both mainstream media (newspapers, magazines, network news) and side-stream media (opinion and curated blogs, Twitter) have paid to this report.

Puzzle 1: Students from countries that score high on PISA in 2012 spend less time in school using computers than European and North American students.

International test comparisons have driven the past thirty years of school reform in the U.S. Doing poorly on international rankings has prodded reformers to call for U.S. students to copy Asian and Scandinavian countries in their language, math, and science lessons. The OECD report on computers in 60-plus countries’ schools, however, offers empirical data that raise serious questions about one sturdy pillar of U.S. school reform: more access to and use of high-tech devices and software will improve teaching and learning.

Consider that 15- and 16-year-old students in Singapore, Korea, Japan, and China (Hong Kong and Shanghai) have scored higher on PISA (first, second, third, fourth, and sixth) than the U.S. (twelfth), yet–this is one big “yet”–have less access to computers in their schools and spend less time in school on the Internet (pp. 18-22). Thus, the report concludes: “PISA results show no appreciable improvements in student achievement in reading, mathematics or science in the countries that had invested heavily in ICT for education” (p. 15).

How come? Why the disparity in the above countries between access and use of computers in schools (all of the above countries have very high rates of computers in homes) and scores on PISA? I do not suggest cause and effect. This is a puzzling correlation that goes against the non-stop championing of school reformers who tout the virtues of getting more and more devices and software into U.S. classrooms. The OECD report does suggest one tantalizing (and possible) reason, however. Maybe, just maybe, the thinking and writing skills necessary to navigate the Internet and read web articles and documents with understanding, as the OECD report says, can be taught just as well in conventional lessons without tablets, laptops, and top-of-the-line software (pp. 15-16). The puzzle remains.

Puzzle 2: Media attention to the OECD report has been minimal, especially in high-tech rich areas.

The report appeared on September 13, 2015. “Warp speed” news in the 24/7 media cycle guaranteed immediate reference to the report, and a flurry of articles in U.S., European, and Asian news outlets appeared (see here, here, here, and here). Within days, the report had been picked up by bloggers and occasional tweeters. Many of the articles and news briefs leaned heavily on OECD press releases and on statements in the document by Andreas Schleicher, Director of Education and Skills for OECD. In the U.S., national and regional newspapers and network TV stations ran pieces on the report (see here, here, and here).

In those areas of the U.S. where high-tech businesses are crucial parts of the economy (e.g., California’s Silicon Valley; Austin, Texas; Boston, Massachusetts), there was barely a passing reference to the OECD report. None at all (as of 9/22) appeared in news organizations in the San Jose-to-San Francisco corridor. Of course, it may be a matter of time–I scoured Google’s references to the OECD report for only 10 days after it appeared. Given the media imperative to produce fresh news hourly, however, if the OECD report went unnoticed when it appeared, the chances that its findings on computer access, use, and academic performance will turn up later are slim. There may well be analyses in magazines, journals, and the blogosphere weeks or months from now, but after 10 days the report will be stale and forgettable news.

Here’s what’s puzzling me: National coverage in the U.S. of the OECD report was spotty. While the Wall Street Journal, Los Angeles Times, and the Washington Post ran pieces on the report, The New York Times has not made reference to it. And in the nation’s hot spots for birthing hardware, software, and apps in northern California, Texas, and Boston, there was barely a mention. How come?

I can only speculate about the little attention that this eye-catching report on the connections between computer access, use, and performance has attracted at a moment when entrepreneurs and vendors promise efficient and effective management of resources and student improvement in reading, math, and science. Across the nation, more and more school districts are spending scarce dollars on tablets, laptops, and software. My hunch is that the mindsets of high-tech entrepreneurs, vendors, media executives, foundation officials, and school district policymakers contain deep-set beliefs in the power of technology to make fundamental changes in every sector of society, including schools. When occasional reports like the OECD one appear that challenge these beliefs, they are noted in passing but not taken seriously, or simply ignored. Academics call this inability to absorb information running counter to one’s beliefs “confirmation bias.” My hunch is that the OECD report has been largely dismissed by ever-scanning mainstream and side-stream media editors, journalists, and bloggers precisely because of this bias toward the power of computers and technology to whip schools into academic shape.


Filed under school reform policies, technology use

15 responses to “OECD Report: Puzzles To Solve (Part 2)”

  1. Sandy

    I would like to add an additional thought to your comprehensive overview. Technology (laptops, tablets, etc.) was not introduced into schools for the benefit of teaching and learning. It was introduced for the benefit of administrators, who wanted efficient data gathering.

    Starting with Net Days and the movement to network all schools in the late 1990s, the IT department has run the show. Teacher workstations were placed in classrooms partly to increase teacher efficiency but, more importantly, for the gradebook program linked (clumsy in the first few years, but flawless now) to the student information system. Hooking the teacher workstation to a TV or projector wasn’t a consideration in the first iteration–it was the gradebook. Hard to believe, but in 1997, the year of the first teacher stations in our high schools, not every classroom had a TV.

    Fast forward to today’s 1:1 initiative: the IT department, not the Department of Instruction, is still responsible for purchasing and implementing the devices, deciding what software can be purchased, and determining which websites can be used. In addition, to address the need for more data, there is formative assessment, with pre-, post-, and re-testing now at the student’s fingertips. Specialized software has been purchased for that. However, at no time has there been talk of an evaluation plan–or, as we call it here, a SMART goal–being attached to this initiative. We have yet to have a curriculum supervisor identify a technology solution for their curriculum that can actually be used as a means to test the efficacy of the 1:1 solution. Professional development for teachers: non-existent. It’s all about the data, not teaching and learning.

    • larrycuban

      I appreciate your specifics, Sandy, on how your district initiated and sustained technology over the years. The organizational split between technology and instruction in your district prompts me to ask whether such splits occur elsewhere. Thanks again, Sandy.

  2. JoeN

    Confirmation bias is glaringly evident in the BBC report you provided a link to, Larry. Read what the various opposing spokespeople say in it, and it is like reading from a Bluffer’s Guide to Educational ICT. I have an article in Friday’s Times Educational Supplement in the UK about this. A serious, objective scrutiny of the way the entire educational technology industry functions is long, long overdue.

    I think Naomi Baron’s research adds real substance to your “tantalising (and possible) reason.” She discusses the differences between reading via a screen and via print in this recent BBC “Word of Mouth” Radio 4 programme. Professor Baron provides the most lucid, insightful, convincing commentary I have heard on the discrepancy noted by the OECD and plenty of other researchers.
    http://www.bbc.co.uk/programmes/b06bnq18

  3. Are we using a tool, the PISA test, that was not designed to measure the effects of technology on math and reading to measure the effects of technology on math and reading? Is that kosher? Just asking. I personally do not think technology improves these skills. It improves the use of technology, which I feel is a desirable skill.

    • larrycuban

      Garth,
      The linkage is between academic performance in reading (test items include “digital reading”) and use of computers in math lessons and in classes, as reported by students–that is, how many minutes of use.

  4. Reblogged this on From experience to meaning… and commented:
    Larry Cuban sums up 2 puzzles, but actually there has been quite a lot of attention to this report in our media, both in Belgium and in the Netherlands, and in the UK I’ve noticed quite a lot of press coverage too. It’s true that there was seldom a reaction from politicians.

  5. Alice in PA

    I am also surprised at the lack of media coverage, but I am a little more jaded in my guess at the reasons. First, this goes against the prevailing narrative about how the use of technology will transform our schools. This could then open the door to a different discussion, as you hinted at in Part 1, about the purpose of public schooling. As I have commented before, my school is in the process of implementing a BYOD or 1-to-1 program. I am so very happy it is a process! This report should factor into any discussions.

    Second, there are some “disturbing” results in the test scores for the use of computers. The US scored above average! This again goes against the narrative.
    Cynical views, I know.

    • larrycuban

      Thanks, Alice, for the comment. I agree that these OECD results counter the current “success” story of technology in schools (and life as well).

  6. Pingback: Computers in education again- anything new under the sun? | bloghaunter
