Affiliated with PerformanceAssessment.Org
New York City Schools Chancellor Joel Klein and Chief Accountability Officer James Liebman repeatedly cite a recent study of Arizona schools, Why Some Schools with Latino Children Beat the Odds… and Others Don’t (Waits, Campbell, Gau, Jacobs, Rex, & Hess, 2006), as supporting evidence of their new policy of periodic assessments.
Here is an analysis of this study.
Why Some Schools with Latino Children Beat the Odds… and Others Don’t (Waits, Campbell, Gau, Jacobs, Rex, & Hess, 2006) is premised on the idea that “success in education these days is measured by test scores” (p. 13). However, this perspective is not universally shared. Noted testing experts Robert Linn (2000) and William Mehrens (1998), and researchers from RAND (Klein, Hamilton, McCaffrey, & Stecher, 2000), have all concluded that rising test scores mean little more than that students are achieving in a very limited domain of knowledge, i.e., that which is on a particular test, and most likely are being taught a narrowed curriculum.
However, if we do adopt the report’s perspective and equate test scores with achievement, some interesting facts emerge from an analysis of the 12 schools in the study that “beat the odds.”
First, the three “steady performer” schools, those that perform consistently above the state average on the Stanford 9 test, have relatively small Spanish-speaking populations: 0%, 27%, and 38%. Two of these are back-to-basics alternative schools; that is, parents choose to send their children there, and it is not known from the report what the admissions criteria are or whether students are academically screened before being admitted. Furthermore, schools that focus on a back-to-basics curriculum might be expected to do well on a test such as the Stanford 9, which tends to measure lower-level skills.
Second, practically all twelve “beat-the-odds” schools had substantially fewer Spanish speakers, some as much as 38 percentage points lower, than the study’s comparison schools, i.e., those schools that were matched for demographic similarities but had worse test results. This disparity calls into question the validity of these school comparisons.
Third, the enrollment for the third grade in some of the “beat-the-odds” schools is very low, e.g., 34 students, throwing the reliability of the schools’ test results into question.
Finally, many researchers caution against using a single test to determine achievement, instead urging the use of a second test, such as the NAEP (National Assessment of Educational Progress), as a check (Amrein & Berliner, 2002; Klein, Hamilton, McCaffrey, & Stecher, 2000). This study reports scores from just one test, the Stanford 9, raising the question of whether teachers were teaching specifically to it and thereby skewing the results.
The Report and Klein’s Initiatives
While Chancellor Klein and Chief Accountability Officer Liebman have embraced this study as evidence of the beneficial effects of periodic testing, they have apparently ignored one of its most important findings: a school must feel ownership of its curriculum and assessments. The study’s authors warn that top-down models do not work. They write, “‘Fixing’ the school doesn’t usually come from ‘out there’ – not from the almost daily onslaught of flavor-of-the-month education reform programs or from the changes imposed from the outside by the school district, the state legislature, or from the Federal government” (p. 45).
For a school to achieve success, the study maintains, the principal and staff must be able to choose, and thus “buy into,” curriculum and assessment approaches from an array of successful options. Though it is true that under the DOE’s plan schools may choose to design their own interim assessments, it remains to be seen how many schools will feel they have the capacity, time, and energy to do so. We speculate that many schools, compelled by the DOE’s top-down mandate to conduct some form of periodic assessment, will really have no choice but to employ the DOE’s standard periodic test, a test which has not yet been developed. Furthermore, once the test scores are returned, it remains to be seen what the schools will actually do with all this data, especially if they are doubtful as to its worth.
A Final Note
While Chancellor Klein lauds the test results of these 12 “beat-the-odds” schools in Arizona, a recent study by Jeff MacSwan (2006) and colleagues at Arizona State University concluded that Arizona’s Proposition 203, which mandates the teaching of English only to non-English speakers, has failed the state’s Latino/a students. Specifically, the study found that 89 percent of immigrant children who scored non-proficient in 2003 were still not proficient in English a year later, and that only 29 percent of all English-learners, regardless of initial proficiency level, showed any growth in English ability at all. These findings are alarming. They also underscore the fact that the “steady performer” schools had relatively small percentages of Spanish-speakers, making these schools less vulnerable to test failures.
References

Amrein, A. L., & Berliner, D. C. (2002). High-stakes testing, uncertainty, and student learning. Education Policy Analysis Archives, 10 (18). Retrieved 7/28/04 from epaa.asu.edu/epaa/v10n18/.
Klein, S. P., Hamilton, L. S., McCaffrey, D. F., & Stecher, B. M. (2000). What do test scores in Texas tell us? Santa Monica, CA: RAND Corporation.
Linn, R. L. (2000). Assessments and accountability. Educational Researcher, 29 (2), 4-14.
MacSwan, J. (2006, May 21). Policies fail state’s growing English-learner population. The Arizona Republic. Retrieved 5/25/06 from www.azcentral.com/arizonarepublic/
Mehrens, W. A. (1998). Consequences of assessment: What is the evidence? Education Policy Analysis Archives, 6 (13). Retrieved 7/28/04 from epaa.asu.edu/epaa/v6n13/.
Waits, M. J., Campbell, H. E., Gau, R., Jacobs, E., Rex, T., & Hess, R. K. (2006, March). Why Some Schools with Latino Children Beat the Odds… and Others Don’t. Phoenix, AZ & Tempe, AZ: Center for the Future of Arizona & Morrison Institute for Public Policy at Arizona State University. Retrieved 5/23/06 from www.asu.edu/copp/morrison/LatinEd.pdf