Connecticut Schools Excelling without Testing?


When it comes to standardized testing, there are a number of stories we can tell. One of the dominant stories about education goes something like this: ‘Kids must take standardized tests every year or else we can’t hold adults accountable for learning – all kids, every year, in all schools. Parents need to know how their children are doing in school. For low-income, Black, and/or Latino & Puerto Rican children this is doubly true. Once the test data comes in, assign merit to individual students. For groups, punish the “low-performers” and reward the “high-performers”.’

Another story is that Black, Latino, and Puerto Rican children and adults have been forced to take standardized tests since IQ tests were first administered in the 1910s. This testing served to mask and justify the underinvestment, academic tracking, segregation, and miseducation in schools. (Au, 2008) Within these conditions, we persisted. Occasionally, lawyers won a civil rights case by citing test data. Oftentimes the tests helped authorities select the most meritorious among us for advancement.

But what would happen if there weren’t high-stakes tests to sort children and rate teachers, schools, and districts? Would the schools still function? Would kids still learn? How would we know who to reward and who to punish? How might this disrupt these two very different stories about testing?

There are, in fact, a small number of public schools in Connecticut where kids don’t take high-stakes tests. For these schools, the CT Department of Education does something very unusual, contrary to what we are told must be done to schools: CT SDE simply invents rankings for them. No tests needed. As former CT State Representative Jonathan Pelto might say, “Wait, what?”

Under the original NCLB law, the state judged schools using kids’ test results, then labeled them as “in need of improvement” or as having made (or not made) “Adequate Yearly Progress.” With the NCLB waiver in 2012, CT schools instead received new, more pleasant-sounding labels such as “excelling”, “progressing”, “transitioning”, “review”, “focus”, or “turnaround.” These labels were similarly based on standardized test data, along with a few other data points, such as graduation rates for high schools.

In Connecticut, there were 60 schools where children did not take high-stakes tests in 2012-13. So those schools can’t be labeled something like “excelling” or “transitioning”, right?

Wrong.

The 60 schools where kids didn’t take high-stakes tests fell into two main categories:

a. Schools that only served pre-kindergarten through grade 2

b. Schools that only served grade 9, grade 11, and/or grade 12

These grades did not have high-stakes tests such as the CMT or CAPT in 2012-13, and these 60 schools enrolled children only in these “non-tested” grades. They may still have had local diagnostic tests or assessments, such as the “DRA”, to check in on kids’ academic progress, and the high schools may have had graduation rate data that could be used to help assign a rating. But the key data point for the rankings, the test results, didn’t exist.

Additionally, there were two schools (+2), Briggs High School and Explorations charter school, where fewer than 20 students took the tests, so their test results weren’t used for “accountability” purposes; they were given a rating anyway.

CT SDE found a way to work around the fact that there wasn’t any test data to label these 60 (+2) schools. According to its school “performance” reports, the State of Connecticut “analyzed district-wide data and applied the results of those analyses to schools without tested grades.”

This means that the State just looked at how the kids in all the other schools in the same district did on the standardized tests (CMT or CAPT) in grades 3 to 8 and 10, then applied the most frequent ranking across the district to any schools without tests.
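In plain terms, here is a minimal sketch of my reading of that method (in Python, with made-up school names and ratings; this is an illustration of the idea, not the State’s actual code or exact procedure):

```python
from collections import Counter

# Hypothetical ratings that a district's tested schools received based on
# their own CMT/CAPT results (made-up data, for illustration only).
tested_school_ratings = {
    "School A": "Excelling",
    "School B": "Excelling",
    "School C": "Progressing",
}

# A school with no tested grades (e.g., a K-2 school) has no test data at all.
untested_schools = ["Grammar School (K-2)"]

# Find the most frequent rating among the district's tested schools...
district_rating = Counter(tested_school_ratings.values()).most_common(1)[0][0]

# ...and assign that same rating to every school without tested grades.
assigned = {school: district_rating for school in untested_schools}
print(assigned)  # {'Grammar School (K-2)': 'Excelling'}
```

The untested school’s label, in other words, is simply inherited from the rest of the district.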

Lost?

Here’s the State’s explanation from the school “performance” reports from 2012-13. In this example, Coventry Grammar School was labeled “Excelling” based on the test results from the other schools in Coventry.

[Screenshot: excerpt from the CT SDE school “performance” report for Coventry Grammar School, 2012-13]

So to recap.

As far as we can tell from the state’s open data, the students in Coventry Grammar School did not take the CMT, CAPT, SBAC, PARCC, or any other high-stakes tests in 2012-13. The school only had children in kindergarten, grade 1, and grade 2. As of this writing, these grades were “untested” by the state, so the whole school was “untested” by CT SDE. Yet, this school was labeled as “excelling” based on the test results from other schools in Coventry. That’s excelling without testing!

Because these schools didn’t take the tests for one of the reasons listed above, the State Department of Education simply made up a ranking based on data from other schools. As the report reveals, this practice served to appease the Federal Department of Education.

So who were these schools? They were primarily suburban and rural schools with only early elementary grades, such as pre-K to grade 2, plus a handful of high schools.

These 60 schools had, on average (mean), 310 students, 66% white students, and 27.5% of students eligible for free or reduced-price lunch. These schools were generally more affluent, more white, more suburban, and smaller in enrollment than the average school in Connecticut in 2012-13. Here is a full list.

Connecticut Public Schools Without High-Stakes Testing – 2012-13

 

Among this group, in order of relative prestige, 7 schools were placed into the “excelling” category, 16 were labeled “progressing”, 34 “transitioning”, 4 “review”, and 1 “focus”.

Only one school among this group, the Ramón E. Betances Early Reading Lab School, was placed in the “turnaround” category. It’s somewhat unclear how the school got this ranking.

Betances was (and is) arguably one of the more isolated schools by race, ethnicity, income, and language status in Hartford. In the past, its overall test results were relatively lower than those of the city’s magnet schools and suburban schools. But the school also went through a “reconstitution” a few years ago in the name of raising test scores (i.e., firing all staff because of low test scores and bringing in new people).

However, CT SDE did not report any test data for this school in 2012-13, even though it did have a third grade class, a “tested” grade. The State provided no performance report for this school either.

Again, it’s unclear why the school received the ranking without any test data. As a school that had a federal “School Improvement Grant”, CT SDE may have automatically placed Betances into the “turnaround” category. It’s also important to note that the Hartford Courant reported that there was a CT SDE investigation into accusations of test cheating at the school, so maybe that’s part of this case as well.

All this labeling, ranking, and sorting without standardized testing in these schools!

To be sure, Connecticut public school students in pre-kindergarten, grades 1, 2, 9, 11, and 12 never took the CMT or CAPT high-stakes tests. (I know, that kind of bursts the “all kids need to be tested every year” bubble.) So these 60 schools were special cases: schools whose children were enrolled entirely in non-tested grades.

By assigning these schools a ranking based on the test results of other schools in the same district, CT SDE quietly patched up another hole in the test machinery. But this patch opened up another hole in the story we are told: that all kids need to be tested every year in all schools, or else!

This patch is difficult for CT SDE to defend. It busts a hole in the claim that tests are needed to tell parents how their children are doing in school. And this quick fix comes as more evidence surfaces that CT SDE has actively suppressed parents’ rights to refuse the tests, or opt out, as a protest aimed at better education policy and practice.

The concept of more holistic accountability of schools is defensible, including periodic testing of a representative sample of the students in a district or school to get a picture of basic academic skill development. This is how the NAEP has worked for decades – testing a representative sample of kids in a handful of grade levels, not every kid, every year, in every grade. This is similar to how public opinion polls and the U.S. Census or American Community Survey work.
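As a rough illustration of the sampling idea (a simplified sketch with made-up numbers, not NAEP’s actual design, which uses carefully stratified and weighted samples), a district could estimate overall proficiency from a modest random sample rather than testing every child:

```python
import random

random.seed(0)  # reproducible example

# Hypothetical district of 10,000 students; assume 60% would score "proficient"
# if every single one were tested (made-up numbers, for illustration only).
students = [1] * 6000 + [0] * 4000
random.shuffle(students)

# Instead of testing all 10,000 kids every year, test a random sample of 400.
sample = random.sample(students, 400)
estimate = sum(sample) / len(sample)

print(f"Estimate from a sample of 400: {estimate:.1%}")
print(f"Result if every student were tested: {sum(students) / len(students):.1%}")
```

A sample of a few hundred students gives a close estimate of the district-wide picture without putting every child through a high-stakes test every year.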

Remember, the dominant story says, “Testing is doubly important for low-income, Black, and Latino kids so they can learn. Without the testing, how can we know the kids are learning? How can parents know how their children are doing?” But these more advantaged schools didn’t need high-stakes testing to exist, get support from their communities and the state, communicate to parents how students were doing in school (I’m guessing that they found other approaches), or receive a ranking that led to the rewards and punishments we are told they need. How did these more advantaged schools communicate children’s academic progress without high-stakes tests every year?

There are drastic differences in power between parents and schools in white communities compared to communities of color, so this testing business can play out differently across schools. Periodic testing might be used as a check-in and provide data that can empower low-income people and people of color. But annual high-stakes testing (HST) isn’t designed to erase historic inequities. Instead, high-stakes testing more often exacerbates those inequities. (Valenzuela, 2004)

Take Hartford, CT, where I live and serve on the school board. About half of Hartford is Latino or Puerto Rican, a third of the population is Black (with many West Indian families), and 10-15% is white. The testing that supposedly offers parents more power is often disempowering.

How?

Occasionally, parents cite the test data to get attention, but their concerns then pivot to mismanagement, mistreatment, under- or selective investment, and miseducation. The same test data is used by people with more power to justify mismanagement or under-service, close schools, offer hefty bonuses to teachers, staff, and administrators, provide more school choice “options”, define kids as “gifted” or “deficient”, and make claims about the people in the schools. Parents have power and agency, but testing doesn’t erase the differences in institutional power that they face. Instead of help, the schools in Hartford often get more tests.

The State already claims that schools are excelling, progressing, and transitioning without testing. A stale vision for public schools might call for more, better tests and more data. Why can’t all schools move forward, ditch the high-stakes tests, and replace this enterprise with a different system of accountability?

A new vision must remember the other story about testing. That story says that low-income, Black, Latino & Puerto Rican kids have received underinvestment, tracking, segregation, and miseducation. Over the years, we’ve received plenty of testing and it has been used to obscure deep inequity. Still, we have persisted and navigated this world.

If we remember this story, then all this testing business becomes more curious. If the testing stopped, we might have time to look back and think more deeply about why all of our schools aren’t always “excelling” despite all the annual, high-stakes testing.

(Note: If you learn, work, or send your children to any of these schools, I would love to hear how they manage without all the testing. Comments are activated.)

Published by

Robert Cotto Jr.

Robert Cotto, Jr. is a Lecturer in the Educational Studies department. Before his work at Trinity, he was a Senior Policy Fellow in K-12 Education for CT Voices for Children where he published reports on Connecticut’s testing system, public school choice, and K-12 education data and policy. He taught for seven years as a social studies teacher at the Metropolitan Learning Center for Global and International Studies (MLC), an interdistrict magnet school intended to provide a high-quality education and promote racial, ethnic, and economic integration. Born and raised in Connecticut, Mr. Cotto was the first in his family to go to college and he earned his B.A. degree in sociology at Dartmouth College, his Ed.M. at Harvard University Graduate School of Education, and an M.A. in American Studies at Trinity College. He is currently completing his Ph.D. in education policy at the University of Connecticut Neag School of Education. Robert lives with his wife and son in the Forster Heights area of the Southwest neighborhood in Hartford. Views expressed in this blog are those of the author and do not necessarily reflect the official policy or position of Trinity College.