What Do Corporate Earnings Reports and School Test Scores Have in Common?

The answer is that intense pressures on corporate executives to satisfy investors and on school leaders to raise test scores distort routine practices and too often lead to chicanery. Over the last decade, stories have emerged of CEOs at multi-billion-dollar corporations doctoring earnings reports to keep investors happy (while protecting their stock options), and of educators fiddling with standardized test scores.

Before the 2008 financial meltdown, earnings statements (and forecasts) as signs of corporate success pushed corporate officers to claim as earnings funds that had little to do with actual transactions with customers in a given year. For example, the CEOs of Computer Associates and Xerox claimed revenues in one year that their customers were actually paying them over three years. In some cases, the chicanery was so blatant that it became criminal. Bending flexible accounting rules too far sent Dennis Kozlowski of Tyco and John Rigas of Adelphia to jail. The collapse of these corporations destroyed the lives of employees and investors.

The obsession with earnings in a given year, even now in the midst of slow job growth, comes from the hard-core belief that these numbers signal the quality of a firm’s performance. Yet economists and financial analysts say repeatedly that one quarter’s earnings do not predict the next quarter’s profits. Nonetheless, earnings reports have become the single marker that has convinced investors that CEOs and their chief financial officers have created value for shareholders.

What’s the connection between earnings reports and the use of standardized test scores to judge school performance? In the past two decades, business-oriented reformers have pressed schools to set standards, be accountable for results, and use test scores as measures of performance. To prod superintendents, renamed CEOs, to produce higher scores, especially in big cities, bonuses go to school chiefs who meet or exceed their targets for improved student achievement. Scores from standardized tests have also become vital in promoting (and flunking) students, awarding (and denying) diplomas, evaluating teachers and principals, and ranking schools for cash awards or penalties for unsatisfactory performance.

With school quality narrowed to standardized test scores and the consequences for poor results ratcheted up, many superintendents, principals, and teachers have devised short-term strategies to secure the benefits and avoid the costs. They work with those students who can make large gains on the tests rather than with the lowest-performing students. They increase test preparation. They drop electives that take time away from the skills and knowledge that will be tested. When scores rise, have students learned more? Hardly. But the adults learn a lot about bending rules to beat the system.

Just as earnings statements are too narrow a measure of corporate performance, test scores barely cover what students are expected to learn in schools. Civic engagement, knowledge of the humanities, building moral character, working in teams, critical thinking, and independent decision making, all historic aims of public schools, are missing from standardized tests.

Moreover, if earnings reports mislead investors as to the actual worth of the firm, standardized test scores mislead parents about the actual performance of their children. There is, for example, no substantial body of evidence demonstrating that students with high scores on standardized achievement tests will do well in college or perform well in the workplace. Sure, graduating from high school and college will mean that you earn more over a lifetime than a dropout. But those earnings derive from the credentials, not from scores on standardized tests.

For those who relish irony, here is a delicious example: business leaders pushing onto schools narrow and misleading measures of student performance while harboring their own narrow and deceptive measures. But irony is cheap. What is expensive is the cost of relying on a single measure to judge complex institutional performance, and the life-affecting consequences for those in the private and public sectors.

Of course, it doesn’t have to be this way. A few wise corporate leaders, lonely to be sure in the recent stock market frenzy, have said publicly that concentrating on annual growth in earnings leads to higher short-term stock prices and weakened long-term performance. Not until former corporate leaders were indicted, handcuffed, and put behind bars, however, did investors take note.

In education, prospects for leaders to raise their voices are dim. When the President of the United States and the U.S. Congress still want every child between the ages of 9 and 14 to be tested annually, it will be hard for educators to stick their necks out and say that such an obsession with testing, like the corporate obsession with raising stock prices and gaming annual earnings reports, is short-changing students and distorting the mission of public schools in a democracy. Yet that is exactly what is happening.


11 responses to “What Do Corporate Earnings Reports and School Test Scores Have in Common?”

  1. An excellent analogy. Often when I suggest that school district officials may manipulate data to make it look as good as possible, some people look at me like I’m cruel or unforgiving. But I always reply, “It’s simply economics.” School executives, like corporate leaders, are incentivized to do so. And let’s not forget, these school executives have high-paying jobs they don’t want to lose.

  2. Pingback: Larry Cuban: What Do Corporate Earnings Reports and School Test Scores Have in Common? « CarmenK12

  3. David Quattrone

    Yes, annual gains on standardized test scores obscure long term value added, just as quarterly earnings fail to reveal underlying weaknesses (or strengths). But with RTTT I fear the horse has left the barn.

    The background Katz article referred to in the blog is instructive. One quotation: “It may be more sensible, from the perspectives of both short-term stock price stability and long-term enterprise value, for companies to focus on communicating with investors and analysts by providing more detailed information regarding performance, strategy, sustainability, risks, key developments, and other long-term variables and value drivers.”

    So the challenge for us is not simply to rail against the short-term measures, but to figure out what “forward looking metrics” make sense.

  4. Tony

    In addition, corporate metrics lead executives to manage toward accounting profit, not economic profit (which is what is in shareholders’ best interest).

    …But while the system is flawed, without it there would be absolutely no accountability for management, and investors would be much more hesitant to invest in companies. Capital markets are built such that if there were a better way to hold companies accountable without losing robustness of the metric, there would be tremendous pressure from institutional investors to get access to those metrics (think OBP vs. BA in Moneyball). It’s misleading to suggest that investors use only next quarter’s accounting earnings forecasts to predict stock price; if they did, they would lose their jobs very quickly to people able to get a more accurate picture by blending a number of different indicators.

    In addition, there is absolutely a correlation between performance on SATs and success in the workplace, which is why many employers require applicants to include this information on their resume. Top consulting firms are widely known to place high value on GMAT scores from business school applicants.

    Ultimately, the answer is not to scrap test scores or initiatives that build the infrastructure necessary to hold teachers and principals accountable for student outcomes, but rather to design a better test that is both robust (hard to juke) and more accurate in predicting a student’s career success.

    We’ll see how successful this initiative is.

  5. Cal

    Yes, I don’t know what you’re talking about when you say there’s no correlation between test scores and college success.

    There is a direct correlation between SAT/ACT scores and remedial coursework: if you have a high enough score, you aren’t assessed for remediation. A low SAT score is highly correlated with likelihood of remediation, and remediation is strongly correlated with failure to graduate.

    Also, within the last decade CALPass did research showing a moderate to strong correlation between STAR test scores and likelihood of college success.

    • larrycuban

      Hi,
      Here is what I said: “There is, for example, no substantial body of evidence demonstrating that students with high scores on standardized achievement tests will do well in college or perform well in the workplace.”

      As you know, there has been and continues to be much criticism of using SAT I and II scores for college admissions. Continued debate, plus the dropping of SATs for admission at many universities and colleges, underscores the point I made in the post. I re-read a recent study (“The SAT as a Predictor of College Success: Evidence from a Selective University” by Kevin Rask and Jill Tiefenthaler, 2009) that summarizes the evidence well. If you can supply cites that I could look at for CALPass, I would be more than willing to change the above sentence. Larry

  6. Chan Bliss

    As educators, we should embrace the business model for our schools, joining our voices with those in the corporate boardroom crying for deregulation.
    Okay, I don’t really believe this, but if you carry the thought to its logical conclusion…

  7. Cal

    The SAT may or may not be valid for college admissions, but I was referring to what happens after admission. Cal State and UCs (as well as most state universities) use SAT scores for remedial assessment. If you have a 550 or higher on the SAT, then you are given an automatic pass on remedial courses at CSU. If you have lower than a 550, you have to take the CSU proficiency test (EAP, I think) and pass it. I can’t remember when I saw the numbers, but I know the CSU benchmarks the tests frequently, and there’s a strong correlation between SAT score (below 550) and proficiency test scores.

    In other words, if you have lower than a 550 on SAT math or verbal, there’s a strong likelihood you will end up in remedial coursework at CSU, and if you have lower than a 450 you can count it a near certainty.

    UCs give you a pass out of the math placement test with a 700, I think, and it’s a 660 for the English essay test, although last I checked they were debating whether to lower the standard because they don’t get paid for remedial courses.

    As far as I know, there is no research on the SAT that differentiates between GPA in remedial courses and GPA in non-remedial courses, which, given how many UC and CSU students are in remediation, makes the declaration about “no predictive value” pretty much a joke.

    However, you don’t need research. You just need to look at their published SAT/ACT required scores to know that there is a strong correlation between SAT score and remedial coursework. Those test score requirements aren’t for show.

    CALPass: http://www.calpass.org/currentreports/CSTExecSummary-07152008.pdf

    “Overall, 11th grade math CST scores were better predictors than class grades of both the level of and grade in the first attempted community college math course. The study found a moderately strong correlation between scores for most forms of the math CST and college course levels and grades. In English, CST scores were moderately strong predictors of the level of the first attempted college English course.”

    It’s simply not true that the SAT has no meaning. Leave aside admissions, and it’s clear that the vast majority of state schools (and many private ones) see the SAT/ACT as a reliable indicator of student readiness for college work. Look at a student’s scores and you may not know whether or not he goes to college or what his grades are, but you do know whether or not the college will consider him ready to start in credit-bearing courses.

    The CSU system is the only one that uses the CST in any way, but I’m very familiar with the CST tests in math, English, and history (I have credentials in all three) as well as most college admissions tests, and I am reasonably certain that CST scores accurately predict the need for remediation.

    You’re at Stanford, right? I imagine you know Michael Kirst. Have you discussed remediation with him?

  8. Cal

    Oops–forgot to mention, but I think you know that remediation is a strong predictor of problems and failure to graduate in college. So if the SAT predicts remediation, and remediation predicts difficulty graduating from college, then a high SAT score predicts better college outcomes, thus “doing well in college”. A high SAT score makes a lot of college barriers disappear.

    I imagine you were thinking of the difference between a 600 and a 720 on any section, but about 10-20% of UC admissions have under 500 on any one section, and of course the CSUs have many students who are below 500 on any one section.

    As for success in the workplace, there’s little evidence that a 1000 total SAT score (three tests) will even get you in the door of law school, med school, or any master’s program outside of education, so certainly, high SAT scores predict a great deal of workplace success. Naturally, they aren’t the only predictor, and the higher the score, the more that other factors come into play. But I don’t think you’d find much support for arguing that a 2000 SAT (three tests) doesn’t predict professional and economic success more than a 1000 SAT.

    • larrycuban

      Thanks for the detailed and thoughtful comments on statements I made about links between standardized tests and college success. I learned a great deal about the UC and CSU systems’ use of test scores for remediation after admission. Larry Cuban

      • Cal

        My pleasure. Thanks for reading.

        I looked up the test names–they are ELM (math) and ELA (reading). The early assessment test is EAP.
