Quietly, a New Test Gains Advocates
As educators and policymakers continue to debate the value of the new Common Core assessments and other mandatory assessments, a small but growing number of schools and districts are signing up to participate in a new and different kind of test: the OECD Test for Schools, a voluntary assessment designed by the Organisation for Economic Co-operation and Development to gauge the thinking skills and attitudes of 15-year-olds.
Piloted during the 2012–13 school year at 105 U.S. schools, including a mix of traditional, magnet, and charter schools (and one private school), the new test was administered at nearly 300 schools this school year. Results will be available to participating schools as early as this month.
Participation is entirely confidential, as are the results. Yet leaders in a handful of schools have publicized their participation and posted their 2012–13 scores. Most say they were looking for a more challenging test than their state tests as a way of preparing teachers and students for the type of problem-solving tasks in the Common Core tests now in development. They also said that with all the emphasis on global competition, they were curious how their own students stacked up against students from other countries. A few participated in order to demonstrate that low-income minority students could perform as well as or better than anyone else nationally or internationally.
Using complex statistical sampling methods, the new OECD Test for Schools compares the test results of students from an individual school to those of students in countries that take the OECD’s Programme for International Student Assessment (PISA) every three years, including the U.S. (as well as the states of Florida, Massachusetts, and Connecticut). For the school-level exam, students respond to approximately two hours of test questions in reading, mathematics, and science, and answer a 30-minute student questionnaire. The testing requires a half-day of school and at least 75 student volunteers (who are representative of the student body), and is administered by CTB/McGraw-Hill.
OECD does not provide rankings, but does report results in ranges that schools can (and do) use to rank themselves compared to results from other countries. Although these early administrations have been partly subsidized by private philanthropies, most districts will have to pay $11,500 per school in order to participate starting next year, according to Peter Kannam at America Achieves, a nonprofit that has been recruiting new schools and coordinating exchanges among participants.
According to the OECD website, the school-level test “provide(s) important peer-to-peer learning opportunities for local educators—locally, nationally, and internationally—as well as the opportunity to share good practices to help identify ‘what works’ to improve learning and build better skills for better lives.”
Unlike existing state tests designed to measure content knowledge, the OECD Test for Schools “goes beyond assessing whether students have mastered what they were taught,” says OECD Deputy Director for Education Andreas Schleicher, the test’s chief proponent. It gauges their “capacity to extrapolate from what they know and apply their knowledge in novel situations.” Both the PISA framework—upon which the new test is based—and the Common Core “place great emphasis on deep conceptual understanding rather than routine cognitive skill only,” he says.
Possibly because schools sign up for the test voluntarily—and in many cases students participate voluntarily—its use hasn’t generally raised concerns among parents or critics of annual standardized testing.
In May 2012, a group of 15-year-olds who attend North Star Academy College Preparatory High School, a Newark, N.J., charter school, were among the first U.S. students to participate in the new test. A reading question presented a passage about an Indian mystic and asked students to explain the author’s attitude towards the mystic’s claims and defend their interpretation with examples. A math question showed rates and terms for various mobile phone plans and asked students to list the advantages of one of the plans. The test concluded with questions that asked students about the quality of teacher-student relations, the disciplinary climate in classes, their level of engagement with mathematics and science, and the amount and kind of reading that they do on their own.
Several weeks after the test was administered, North Star administrators received a colorful 100-plus page report entitled “How Your School Compares Internationally.”
The results were eye-opening as well as galvanizing. In reading, the school came in well above the U.S. average. However, in math, North Star came in below the OECD average (if slightly above the U.S. average). In science, it came in below the U.S. average (if slightly above the OECD average). The student questionnaire also showed that students’ interest in math and science generally was lower than the U.S. average.
“We’re usually not below average in anything,” said Michael Mann, North Star’s head of school, who added that he decided to pilot the test to demonstrate that low-income minority students, like those in his school, could compete internationally if given the right instruction and other supports.
Mann first shared the results with teachers, parents, and board members. Then each department met to dig into the test score details and the student survey answers and come up with possible changes. North Star ended up replacing a senior-year research project with a science lab internship at nearby Seton Hall University, added AP Chemistry, created computer science classes (AP and elective), and began overhauling its curriculum to create a new engineering “major” starting next fall. A year after taking the pilot test, more than 50 percent of the seniors said they planned to be STEM majors in college.
“The process was amazing,” says AP Biology teacher Syrena Burnam, who helped implement the changes. “I feel really empowered by the experience.”
Curiosity about where their students stood in relation to other nations also led Principal Teresa Johnson of Chantilly High School to have her students participate in the 2012 pilot. “We call ourselves a ‘world-class education system,’” says Johnson, whose school is located in suburban Fairfax County, Va. “I was curious if we really are.”
Overall, Chantilly performed much better than the U.S. average in math and about average in reading; still, the results weren’t the kind of figures Chantilly usually got on tests. The questionnaire also revealed that 44 percent of the students were “deeply and highly restrictive readers” who didn’t read for enjoyment. And Johnson was somewhat startled to see how students felt about the learning environment. “We always thought, ‘Our teachers are so great, our kids love it here,’” she says. From the student survey responses, however, “We got a different perception of how things feel.”
In response, Chantilly High School is working on bringing deeper reading, application of knowledge and critical thinking to its classrooms, and fostering closer relationships with students. But it’s going to take some time, says Johnson. “For years we’ve been telling them to help kids pass the [state tests],” she says of her 220 classroom teachers. “And now I’m telling them, ‘Don’t worry about the [state tests] and instead focus on these [PISA] skills.’”
Some responses to the test results have been troubling to outside observers.
When Blue Valley School District, a top-ranked district in Kansas, got the 2012 results from its five high schools, gratified administrators found that their students did as well as or better than other well-off schools in the U.S. and placed near the top of the international rankings, 7th in math and 4th in reading, right behind Finland. In order to spur teachers and students to do even better on the 2013–14 test, Blue Valley began touting its efforts on social media like Twitter using the hashtag #beatfinland.
Some observers, however, like Finnish educator and author Pasi Sahlberg, an expert on the PISA test, believe that this particular practice is inaccurate and unhealthy. “You cannot really compare a school to a system,” Sahlberg says.
“That isn’t what the test is meant to do,” says Alejandro Gomez-Palma, an OECD policy analyst. “We don’t compare schools to countries.”
However, the temptation to rank schools can be too much to resist. When the Arizona-based BASIS network of schools decided to open a new school in Brooklyn, it took out an ad touting its 2012 scores in the New York Times and posted them online.
Still, participants suggest the wealth of information the test provides may be worth the risk that some may fixate on a single test or limit their follow-through to marketing.
Although results for Arroyo Grande High School, the only pilot participant from California, weren’t great, the lengthy report OECD sent back was “better than anything we have ever gotten from any other standardized tests,” according to testing coordinator Hillery Dixon.
Both test results and survey responses pointed to the need to put a stronger focus on critical thinking, problem-solving, and written responses. And so, when the time came around last year to decide whether to participate in the OECD test again, Arroyo Grande re-upped along with two other high schools in the district.
After sitting out the pilot round, Montgomery County, Md., joined the other districts participating in the 2013–14 test. “There’s this huge assessment void” while Common Core assessments are being rolled out, says Superintendent Joshua Starr. “We need a dataset that can help us now.”
In addition, Starr says, the school-level tests are “a great way to communicate to teachers and others what international standards are really all about.”
Alexander Russo is a freelance education writer whose work can be seen at www.thisweekineducation.com or via @alexanderrusso.