KERA (1990-2010): What have we learned?


“I came on the local (school) board in 1987. What you just said to me is no different than what I heard in 1987. So why should I be hopeful?” - Kentucky Board of Education Chair Joe Brothers’ reaction to proposals to fix Kentucky’s continuing education problems during the October 2009 meeting of the Kentucky Board of Education

Download full report

Executive Summary

Not all of the policies contained in the Kentucky Education Reform Act of 1990 (KERA) have failed. However, it is evident that many Kentuckians have come to share Brothers’ frustration because a great many aspects of KERA have not worked out as promised.

In its 20 years of existence, a large number of KERA’s major efforts have proven unsuccessful, despite the fact that educators assured citizens time and again – often using terms like “the research shows” – that their fad ideas were succeeding. As a consequence, educator credibility has dropped to the point where policymakers like Brothers now feel compelled to make comments like the one above.

As Kentucky embarks on a serious rebuilding of the state’s education standards and the related assessment program, and considers other efforts such as charter schools to genuinely and thoroughly reform public schools, this paper presents a list of major KERA initiatives that Kentuckians were assured were supported by research but that did not work in practice.

A prime goal of this report is to ensure that we avoid reintroducing any of these failed ideas as Kentucky’s education standards and the related assessment program are rebuilt.

Another goal is to raise awareness of the fact that today’s educators still don’t know enough about how kids learn. Arthur Levine, former president of Teachers College at Columbia University, has noted: “There is widespread disagreement among policymakers, researchers and practitioners about what constitutes good research and how to prepare education researchers.” As Levine and others have realized, most education research is not conducted with sufficient rigor to “show” anything with an adequate degree of certainty.

However, we do know somewhat more than we did in 1990. We now know that many major programs failed under KERA, including the following:

  • Writing portfolios for accountability – While writing portfolios can be a good instructional tool when used properly, their use in the state assessment program actually impaired writing instruction. Thus, they were removed from the assessment program by Senate Bill 1 in 2009.
  • De-emphasis on grammar, spelling and punctuation – The argument was made that students would learn these skills naturally as they wrote. That did not happen. The adverse effects of this de-emphasis were intensified by arcane rules – intended to prevent cheating – that constrained both the instruction of writing and the grading of writing portfolios, which relied on “holistic” approaches that downplayed writing mechanics. These teacher-limiting rules became an unintended, unwanted consequence of using writing portfolios in the state assessment program.
  • Mathematics portfolios in assessment and accountability – Instead of doing math, students were mostly just writing about it. Math portfolios – included in the state’s assessment from 1993 to 1996 – were discarded by the legislature following numerous complaints from teachers and parents.
  • Fuzzy math – Pushing process to the detriment of students’ learning standard math operations like division and fractions was found to adversely impact all higher-level math courses from algebra up, both in K-12 schools and in postsecondary institutions. The math program also tried to cover too many topics, becoming “a mile wide and an inch deep” in the process.
  • Whole language reading – The de-emphasis of phonics in the early days of KERA was misguided from the outset.
  • Ungraded Primary (which replaced kindergarten through third grade) – The ungraded primary concept required teachers to make constant judgments of student performance and to individually determine when students were ready for fourth grade. Today, ungraded primary technically remains on the books, but the real concept is generally unenforced and often totally ignored. Furthermore, new research discussed in the body of this report indicates that there is no difference in performance between Kentucky elementary schools still using ungraded primary and those schools that have reverted to a standard, grade-by-grade organization.
  • Performance Events – These unusual assessment items required students to work in small groups to solve a problem. The students then individually wrote reports to be graded at a central site. A typical fourth-grade performance event asked students to estimate the number of ladybug cartoon images printed on a piece of paper. The intent was to see if students would do something like dividing the paper into fourths, counting the bug images in one section, and then multiplying by four to get the total estimated number of images. Of course, by the fourth grade it should have been trivially easy for students to simply count all the images on the paper. In the end, issues of creating, linking and equating different performance events from year to year proved unworkable, and this “great hope” of KERA’s assessment program failed disastrously in 1996. The middle school performance event selected that year was far too difficult, and all middle schools got dramatically low scores. Eventually, the legislature threw all performance event results out of the evaluation process.
  • First testing company was an unknown with limited resources – Despite strong support from the Kentucky Department of Education (KDE), Advanced Systems in Measurement and Evaluation proved unequal to the challenge of performance events and was terminated for cause following a scoring fiasco that produced inaccurate scores for every elementary and middle school in the state. The erroneous scores, in turn, caused improper distribution of $25 million in reward money. Kentucky taxpayers had to provide an additional $1.5 million to pay rewards that schools had earned but not received due to the scoring error.
  • Kentucky Instructional Results Information System (KIRIS), Kentucky’s first assessment program – Disbanded by the legislature in 1998 for poor performance. KIRIS scores became inflated while other indicators of performance, such as the National Assessment of Educational Progress (NAEP) and the ACT college entrance test, indicated Kentucky education was making little progress.
  • Multiple-choice questions – Included, in a limited way, at the inception of KIRIS testing in 1992; removed completely after 1994; partially reinstated from 1997 through 1998 following considerable criticism of the 1994 decision; and finally accepted as necessary and included in the accountability formula for the first time after KIRIS was disbanded following its 1998 administration.
  • Commonwealth Accountability Testing System (CATS), Kentucky’s second assessment program – Launched in 1999 and disbanded for cause in 2009. Again, scores were inflated beyond anything that could be supported by other data sources, including a chronically high statewide remediation rate for entering college freshmen and continued severe inflation of proficiency rates compared to the NAEP.
  • Dropout rates for school accountability – Used for school accountability from the beginning of KIRIS in 1992 but found highly inaccurate in an official audit by the Kentucky Auditor of Public Accounts in 2006. After No Child Left Behind began, the focus shifted to equally unreliable graduation rate statistics based on the same inaccurate dropout figures. These are now slated to be replaced by a more carefully researched graduation rate formula in 2011. Internal audits by the Kentucky Department of Education, if conducted at all, never revealed the problems found in the official 2006 audit.
  • Site-Based (or School-Based) Decision Making Councils (SBDM) – This statewide KERA creation stripped locally elected school boards and superintendents of all real authority in schools. Instead, key decisions on matters like curriculum and final funds distribution became the purview of these teacher-controlled councils. Along the way, SBDM destroyed the logical chain of command in education and replaced it with a chaotic system where accountability is unclear and difficult to enforce. Superintendents can still fire principals on paper, but doing so in practice is often unworkable because the SBDM council can simply rehire the same person. Overall, the SBDM governance model thoroughly eliminated school accountability to local taxpayers and parents.
  • Accurate reporting of data and implications – This problem was extensively highlighted in a study conducted for the Kentucky Office of Education Accountability in 1995. Though somewhat improved over time, KDE’s data reporting still suffers from “spin.” One notable example was a recent effort to protect the inflated CATS assessment with the KDE-operated 2008 CATS Task Force. KDE tried to hog-tie the task force so only minor changes would be recommended for CATS. That KDE effort ultimately proved a failure when the legislature voted to disband CATS in early 2009.
  • School accountability – KIRIS and CATS identified very few schools as problematic, although external data from NAEP, ACT, No Child Left Behind (NCLB) and high college remediation rates indicated problems were far more extensive. In particular, NCLB firmly exposed the very serious failure of CATS to identify, and hold schools accountable for, achievement gaps for disadvantaged students. However, only a few schools have lost their site-based council authority due to low performance. For example, in 2008, Kentucky had 34 schools in the lowest NCLB performance category of “Tier 5.” Those schools consistently failed to make adequate progress for at least six years and were supposed to implement alternative governance plans. Yet in that same year, only four schools lost their SBDM authority, all due to poor performance on CATS. Schools did not start to lose SBDM authority under federal accountability efforts such as Race to the Top and NCLB until 2010. To date, only 10 SBDM councils have been affected by federally required accountability programs, and all of those are due to Race to the Top/School Improvement Grant program related activities.
  • Educator accountability – Despite a great deal of commentary about Kentucky wanting highly skilled teachers in every classroom, the state’s education system has no formal process to evaluate and remove tenured teachers who fail to do even a minimally acceptable job of educating students. Even in cases where teachers have committed serious crimes, it can be time-consuming and expensive to suspend or revoke the teaching certificates of individuals who clearly have no place in the classroom. For teachers who simply lack competence, no formal process even describes the steps to take toward decertification. Kentucky’s Education Professional Standards Board informs us that no Kentucky teacher has ever had a certificate suspended or revoked for such reasons.
  • Overcorrection for funding inequity – When KERA began, districts with low per-pupil property wealth generally also had schools with low total per-pupil funding. That situation has now sharply reversed. For example, students in the high property wealth Boone County Public Schools in Northern Kentucky sometimes attend classes in portable buildings, while “downstate,” low property wealth schools are now housed in beautiful and modern buildings.
  • Bang-for-the-buck efficiency has steadily declined – Even though test scores have risen slightly over time, the costs involved have skyrocketed. As a result, the dollar cost for each test score point on both the NAEP and the ACT college entrance test has been steadily rising since KERA began (a simple illustration of this cost-per-point measure is sketched below). The Kentucky Constitution requires the state to establish an efficient school system, so the rapid rise in cost versus performance may not comply with the obvious intent of this requirement.
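
The “cost per test score point” measure referenced in the last item above is simply a ratio of spending to achievement. The short sketch below is only an illustration of how such a ratio behaves – the function name and all dollar and score figures are hypothetical placeholders, not data drawn from this report or from actual NAEP or ACT results.

    # Illustrative only: how a "cost per test score point" efficiency measure behaves.
    # All figures below are hypothetical placeholders, NOT actual Kentucky spending
    # or NAEP/ACT score data.

    def cost_per_point(per_pupil_spending: float, average_score: float) -> float:
        """Dollars of per-pupil spending for each point of average test score."""
        return per_pupil_spending / average_score

    # Hypothetical early-KERA and recent-year values (inflation-adjusted dollars assumed).
    early = cost_per_point(per_pupil_spending=5000.0, average_score=215.0)   # about $23 per point
    recent = cost_per_point(per_pupil_spending=9000.0, average_score=225.0)  # about $40 per point

    print(f"Hypothetical early-KERA cost per score point: ${early:.2f}")
    print(f"Hypothetical recent-year cost per score point: ${recent:.2f}")

    # When spending grows much faster than scores, the cost per point rises --
    # the pattern the report describes as declining "bang for the buck."
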

Download full report
