
Compared to Other Countries, Does the United States Really Do That Badly in Math?

Eric A. Hanushek, Paul E. Peterson
Published Date: 
July 12, 2010
Education Next

Many Americans were shocked to learn how poorly U.S. students were doing when the Programme for International Student Assessment (PISA) released its study of math achievement for 2006. U.S. 15-year-olds came in 35th among the 57 nations that participated. The U.S. average score was 474 points, against an average of 500 for students in the industrialized countries that belong to the Organisation for Economic Co-operation and Development (OECD), PISA's sponsor.

But educators were encouraged in December 2008 when another respected international survey, the Trends in International Mathematics and Science Study (TIMSS), released results from its math testing for 2007. It found U.S. 8th graders ranked 9th among the 48 participating countries, and their score of 508 was above the average for all students from participating countries. Furthermore, some, such as Tom Loveless of the Brookings Institution, have claimed that TIMSS does a better job of measuring math knowledge than PISA does. (Mark Schneider took a close look at both tests in this 2009 article for Education Next.) More than one commentator has cited these facts to argue that the problems of American schools have been exaggerated.

Have we unfairly maligned our schools?

To figure out why the two tests seemed to point in somewhat different directions, we decided to take a careful look at the facts. Specifically, we looked at the countries that participated in the PISA test but not TIMSS, and vice versa. As can be seen at the bottom of this post, fully 22 of the countries that outperformed us on PISA in 2006 simply did not participate in the TIMSS testing. They include a large chunk of the industrialized OECD countries that are the ordinary reference group for the United States, along with a smattering of developing countries that also outperform us in math.

It is true that students in Singapore, one of the world's hotbeds of math knowledge, took the 2007 TIMSS but not the 2006 PISA; otherwise, the countries that took TIMSS but not PISA come from the developing world. Further, the TIMSS average, calculated in 1995, was based on results that included scores from 12 developing countries.

In short, those who defend U.S. performance by pointing to TIMSS are essentially claiming that the United States scores just a bit better than an average that excludes many top performers but incorporates results from many developing countries.