Programme for International Student Assessment

From Wikipedia, the free encyclopedia

The Programme for International Student Assessment (PISA) is a triennial, worldwide test of 15-year-old schoolchildren's scholastic performance, coordinated by the Organisation for Economic Co-operation and Development (OECD).

The aim of the PISA study is to test and compare schoolchildren's performance across the world, with a view to improving educational methods and outcomes.

Development and implementation

Development of PISA began in 1997, and the first assessment was carried out in 2000. The tests are taken every three years. Each assessment cycle focuses on one particular subject while also testing the other main areas studied, and the focal subject rotates from one cycle to the next.

In 2000, 265,000 students from 32 countries took part in PISA; 28 of them were OECD member countries. In 2002 the same tests were taken by 11 more "partner" countries (i.e. non-OECD members). The main focus of the 2000 tests was reading literacy, with two-thirds of the questions on that subject.

PISA's debut round in 2000 was delivered on the OECD's behalf by an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER has continued to lead the design and implementation of subsequent rounds of PISA for the OECD.

Over 275,000 students took part in PISA 2003, which was conducted in 41 countries, including all 30 OECD countries. (Britain's data collection, however, failed to meet PISA's quality standards, so the UK was not included in the international comparisons.) The focus was mathematics literacy, testing real-life situations in which mathematics is useful. Problem solving was also tested for the first time.

In 2006, 57 countries participated, and the main focus of PISA 2006 was science literacy. Results are due out in late 2007. Researchers have begun preparation for 2009, in which reading literacy will again be the main focus, giving the first opportunity to measure improvements in that domain. At last count (end-March 2007), about 63 countries were set to participate in PISA 2009, and it is anticipated that more countries will join before then.

ACER leads the development of the methodology and procedures required to implement the PISA survey in all participating countries. It also leads the development and implementation of sampling procedures and assists with monitoring sampling outcomes across these countries. ACER likewise constructs and refines the assessment instruments at the heart of PISA: the reading, mathematics, science and problem-solving tests, computer-based testing, and the background and contextual questionnaires. ACER also develops purpose-built software to assist with sampling and data capture, and analyses all the data.

Seeing a single PISA cycle through from start to finish takes over four years.

Comparison with TIMSS and PIRLS

Another international mathematics assessment is the Trends in International Mathematics and Science Study (TIMSS), undertaken by the International Association for the Evaluation of Educational Achievement (IEA). Results from TIMSS often contradict results of the PISA test.

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts. To solve the problems, students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content, such as an understanding of fractions and decimals and the relationship between them. It divides mathematics into two dimensions: applied-knowledge "cognitive domains" and more traditional "content domains". The cognitive domains it covers are "Knowing Facts and Procedures, Using Concepts, Solving Routine Problems and Reasoning", and the content domains are "Number, Algebra, Measurement, Geometry and Data". The latter reflect "the importance of being able to continue comparisons of achievement with previous assessments in these content domains" (TIMSS Assessment Framework 2003, pdf). PISA argues that international assessment should not be restricted to a set body of knowledge; instead, it addresses education's application to real-life problems and lifelong learning.

In reading literacy, the equivalent of TIMSS is the Progress in International Reading Literacy Study (PIRLS). According to the OECD, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling". Instead, students should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts" (Chapter 2 of "PISA 2003 Assessment Framework", pdf). PIRLS, by contrast, describes reading literacy as "the ability to understand and use those written language forms required by society and/or valued by the individual" (Chapter 1 of PIRLS 2006 Assessment Framework, pdf), thereby including the use of written language forms within reading literacy. According to the IEA, however, in scoring the PIRLS tests "the focus is solely on students' understanding of the text, not on their ability to write well" (Chapter 4 of PIRLS 2006 Assessment Framework, pdf).

Method of testing

The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. In total there are six and a half hours of assessment material, but each student is not tested on all of it. Participating students also answer a questionnaire on their background, including learning habits, motivation and family. School directors also fill in a questionnaire describing school demographics, funding and so on.

Results

The results of each period of assessment normally take at least a year to be analysed. The first results for PISA 2000 came out in 2001 (OECD, 2001a) and 2003 (OECD, 2003c), and were followed by thematic reports studying particular aspects of the results. The evaluation of PISA 2003 was published in two volumes: Learning for Tomorrow's World: First Results from PISA 2003 (OECD, 2004) and Problem Solving for Tomorrow's World – First Measures of Cross-Curricular Competencies from PISA 2003 (OECD, 2004d).

Here is an overview of the top six scores in each domain in 2003:

Mathematics
1. Hong Kong 550
2. Finland 544
3. South Korea 542
4. Netherlands 538
5. Liechtenstein 536
6. Japan 534

Reading literacy
1. Finland 543
2. South Korea 534
3. Canada 528
4. Australia 525
5. Liechtenstein 525
6. New Zealand 522

Science
1. Finland 563
2. Hong Kong 542
3. Canada 534
4. Taiwan 532
5. Estonia 531
6. Japan 531

Problem solving
1. South Korea 550
2. Finland 548
2. Hong Kong 548
4. Japan 547
5. New Zealand 533
6. Macau 532

Professor Jouni Välijärvi, who was in charge of the Finnish PISA study, attributed the high Finnish score both to excellent Finnish teachers and to Finland's 1990s LUMA programme, which was developed to improve children's skills in mathematics and the natural sciences. He also drew attention to the Finnish school system, which teaches the same curriculum to all pupils. Indeed, individual Finnish students' results did not vary a great deal, and all schools had similar scores.

An evaluation of the 2003 results showed that the countries which spent more on education did not necessarily do better than those which spent less. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, Korea and the Netherlands spent less but did relatively well, whereas the United States spent much more yet scored below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States, which came 24th out of the 29 countries compared.

Compared with 2000, Poland, Belgium, the Czech Republic and Germany all improved their results. Polish students, apparently owing to the changes introduced in the educational reform of 1999, had above-average reading skills in PISA 2003, having been near the bottom of the list in PISA 2000.

Another point made in the evaluation was that students with higher-earning parents tend to be better educated and to achieve higher results. This was true in all the countries tested, although it was more pronounced in some, such as Germany.

2006 survey

Here is an overview of the 20 highest-scoring countries and economies in 2006:

Rank  Mathematics      Science          Reading
 1.   Taiwan           Finland          South Korea
 2.   Finland          Hong Kong        Finland
 3.   Hong Kong        Canada           Hong Kong
 4.   South Korea      Taiwan           Canada
 5.   Netherlands      Estonia          New Zealand
 6.   Switzerland      Japan            Ireland
 7.   Canada           New Zealand      Australia
 8.   Macau            Australia        Liechtenstein
 9.   Liechtenstein    Netherlands      Poland
10.   Japan            Liechtenstein    Sweden
11.   New Zealand      South Korea      Netherlands
12.   Belgium          Slovenia         Belgium
13.   Australia        Germany          Estonia
14.   Estonia          United Kingdom   Switzerland
15.   Denmark          Czech Republic   Japan
16.   Czech Republic   Switzerland      Taiwan
17.   Iceland          Macau            United Kingdom
18.   Austria          Austria          Germany
19.   Slovenia         Belgium          Denmark
20.   Germany          Ireland          Slovenia

Reactions to the results

For many countries, the first PISA results were a nasty surprise; in Germany, for example, the comparatively low scores triggered a heated debate about how the school system should be changed. Other countries were pleasantly surprised, and national newspaper headlines reflected these contrasting reactions.
