In a first, 15-year-olds shine in reading, science

Nation’s kids top fields in PISA test

Staff Writer

For the first time ever, Japanese 15-year-olds topped the list in reading and science performance in an international academic survey last year covering 34 developed countries, according to data released Tuesday by the Organization for Economic Cooperation and Development.

Japanese students also turned in the second-best math performance among the 34 OECD member nations in the triennial study, the Program for International Student Assessment, or PISA.

The scores in every category were all-time highs, indicating scholastic performance is improving, education ministry officials said. In 2009, Japanese students were second in science, fourth in math and fifth in reading.

The latest survey, conducted last year, sampled about 510,000 students in 65 countries and regions, including 34 OECD member countries. In Japan, some 6,400 students were randomly selected for the survey.

“It is the first time for our country to top the rankings in reading and science performance among OECD countries in this survey,” education minister Hakubun Shimomura said Tuesday.

“Japanese students achieved the best showing ever,” he said, adding that the percentage of students showing lower proficiency in the three fields decreased, while the percentages at the higher levels increased.

Tetsuya Kishimoto, an official at the ministry’s elementary and secondary education bureau, credited efforts by the ministry and others to improve academic skills and the educational environment, including the curriculum guideline change in fiscal 2009 that expanded science and math education at junior high schools.

Among the 65 countries and regions surveyed, Japan rose to seventh from ninth place in math, and to fourth in reading and science, up from eighth and fifth respectively.

Shanghai took top marks in all three categories. In math, Singapore was second and Hong Kong third; in reading and science, Hong Kong was second and Singapore third.

The Paris-based OECD launched PISA in 2000 to address demands by member nations for “regular and reliable data on the knowledge and skills of their students and the performance of their education systems.”

Past results prompted the government to revise its “relaxed and lighter curriculum” education policy, a response to the so-called PISA shock of the 2003 poll, in which Japanese students dropped from the top spot to sixth place in math and from eighth to 14th in reading.

  • bigbamboo

    “But, because results are based on a sample of students, its relative position could be between 2 and 3 in mathematics, between 1 and 2 in reading, and between 1 and 3 in science.” Quoted from the Key Findings for Japan in the PISA 2012 report. It is one thing to be proud of our students’ achievements, and quite another to be trumpeting an incomplete story. It would have been more useful for readers to provide a link to the PISA 2012 results for Japan, then we could make up our own minds as to whether the nation’s students top fields or not.

  • Bruce Chatwin

    “what if there are “serious problems” with the Pisa data? What if the statistical techniques used to compile it are “utterly wrong” and based on a “profound conceptual error”? Suppose the whole idea of being able to accurately rank such diverse education systems is “meaningless”, “madness”? What if you learned that Pisa’s comparisons are not based on a common test, but on different students answering different questions? And what if switching these questions around leads to huge variations in the all-important Pisa rankings, with the UK finishing anywhere between 14th and 30th and Denmark between fifth and 37th? What if these rankings – that so many reputations and billions of pounds depend on, that have so much impact on students and teachers around the world – are in fact “useless”?”

    From “Is Pisa fundamentally flawed?” by William Stewart. Published in TES magazine on 26 July, 2013.

  • James

    The nation’s students came out with a high ranking in a global academic survey, and there are only two comments here, both of which dispute the methods used.

    Who wants to bet that had the rankings been on the lower end, this article would have been inundated with dozens of comments criticizing the entire education system, the parents, government and society?

    • Franz Pichler

      yeah, spot on!

    • Bruce Chatwin

      James, the PISA results are fundamentally flawed wherever they may rank an OECD member. Take a look at the Times Educational Supplement article on the PISA rankings. That countries evaluate the success of their school systems, and as a result spend (or don’t, as their position in the rankings purports to indicate) taxes on those systems based on the PISA rankings, is very disturbing. The Times Educational Supplement article points out that Professor Svend Kreiner of the University of Copenhagen, Denmark, has looked at the reading results for 2006 in detail and notes that another 40 per cent of participating students were tested on just 14 of the 28 reading questions used in the assessment. So only approximately 10 per cent of the students who took part in Pisa were tested on all 28 reading questions. “This in itself is ridiculous,” Kreiner tells TES. “Most people don’t know that half of the students taking part in Pisa (2006) do not respond to any reading item at all. Despite that, Pisa assigns reading scores to these children.”

      • James

        You’re replying to my post with an off-topic answer. I couldn’t care less about the debate for or against the validity of the PISA methods.

        I will though give my two cents’ worth: *any* assessment procedure of whatever kind can be shown to be flawed. Whether it’s the method FIFA uses to rank soccer-playing countries, or which countries are supposedly the happiest, or whatever. In any case, there *will* always be some who’ll decry it.

        I just couldn’t help commenting when I saw people moaning that Japanese students ought not to be rated No. 1 but No. 2 or 3. ;D

        Does the difference of a few ranks mean so much?

        Oh, yes it does. It gives people something to whine about.

        That, my friend, was what my post was all about.