The Rise of Indiana Open Enrollment

February 26, 2018

(Guest Post by Matthew Ladner)

EdChoice’s Drew Catt created this open enrollment map of Indiana. For those squinting at their iPhones, bright yellow signifies a district taking in 0-25 open enrollment students, while dark green denotes a district bringing in 501 to 1,680 open enrollment students.

So let’s contrast this with the Fordham Ohio open enrollment map:

The Fordham map denotes participation/non-participation by districts in open enrollment. Suburban non-participation jumps off the page of the Fordham map, so let’s contrast Indianapolis with Columbus. The Indiana map has a lot of green around Indianapolis, signifying open-enrollment participation by the suburbs.

Now let’s compare Indiana to the open enrollment data available from Arizona.

Much larger numbers in these Arizona districts, but the Arizona data also use a broader definition of open enrollment, one that includes students transferring within district boundaries. Nevertheless, we know from a separate source that Scottsdale Unified has 4,000 students from outside of district boundaries, which is more than twice the number of any of the Indiana districts in the EdChoice map.

So here is my provisional take, subject to your challenge in the comments: Indiana’s combined choice programs have coaxed the state out of Ohio-like geographic segregation. Private choice program design may have contributed to this- Ohio’s voucher programs focus almost exclusively on urban students, while Indiana’s are more inclusive. Indiana has had the nation’s fastest growing voucher program in recent years. Although means-tested, Indiana’s private choice programs create empty seats in suburban districts to a greater degree than Ohio’s programs, which reach only suburban special education students.

The open enrollment boulder has been rolling downhill for a longer period of time in Arizona. Open-enrollment students outnumber charter students 2-1, and charter students outnumber private choice participants by 3-1. In other words, in Arizona school choice is being done primarily by school districts themselves. This of course did not happen exclusively through a process of spontaneous enlightenment whereby Arizona school districts threw down the drawbridge over the moat to welcome in thousands of out-of-district transfers out of the goodness of their hearts. Rather it was the product of incentives- hundreds of charter schools opening in suburbs and towns and a couple of decades of geographically inclusive private choice programs.

Charters and private choice do not deserve all the credit, as some suburban districts relatively unaffected by them nevertheless chose to participate in open enrollment. Chandler Unified for instance watched its enrollment grow by a third despite a large increase in charter schools and has been rocking academic growth to boot. I’m told that there is not a non-district charter in the Vail Unified district south of Tucson, but there are many students from Tucson Unified. I doubt they are sweating choice much, but they have nevertheless chosen to participate, and Arizona’s students are the richer for it. Nevertheless, it seems self-evident that a main reason Scottsdale Unified took in 4,000 students is the 9,000 students who live within district boundaries but do not attend school in the district.

It may be no accident that the state with the highest access has also been leading in NAEP gains…

The participation of early open-enrollment adopters increases the pressure on other districts to join them, creating a virtuous cycle. I’m thrilled to see evidence of this in Indiana. School Choice 1.0 failed urban students insomuch as it failed to unlock the suburbs. It’s time for the movement to embrace an inclusive “Social Justice Plus” strategy that aims to give urban students access to private, charter and suburban schools.


The Rising Tide in the Desert

February 21, 2018

(Guest Post by Matthew Ladner)


Too much of Arizona’s K-12 debate focuses on inputs, too little on outputs. Some districts have been gaining enrollment, others losing enrollment, but this is an entirely secondary concern when compared to the question of whether an increasing number of Arizona students are acquiring the knowledge, skills and habits for success in life.

The above chart from the Center for Student Achievement shows enrollment trends for districts and charters in a number of Arizona districts. Some districts gained students despite the rise of charters (Chandler, Higley, Queen Creek) while others lost enrollment. All of the above Arizona district/charter combos performed well to spectacularly in Stanford scholar Sean F. Reardon’s measure of academic growth, with the lone exception of Coolidge Unified.

The Scottsdale district/charter combo came in at the 64th percentile, Tucson at the 67th, Queen Creek at the 68th, Deer Valley at the 83rd, and Roosevelt at the 89th. All three of the truly spectacular scores (the Higley, Chandler and Phoenix Elementary combos at the 95th, 95th and 99th percentiles respectively) came from situations where both the district and charter sectors grew rapidly. Congratulations to the students and educators of these communities are richly deserved.

The growth party did not stop in 2015. Here are the ELA proficiency trends in AZMerit for all of these districts:

and here is the same chart for math:

Some of these gains are large (see Queen Creek and Scottsdale) others incremental, but every single one of them is moving in the right direction.

Wait- my telepathic powers are picking something up. You were thinking “Ladner, are you really going to celebrate Roosevelt going from 17% proficient to 23%?”

I’m glad you asked.

Two things- first, the AZMerit academic bar is high, and second, most of the rest of the country seems mired in academic stagnation. Of course I’m not satisfied with 23% proficiency (#NoAZwe’vegottoWinMOAAARRR!) but I am indeed happy that both low performing and high performing districts show improvement.

Experience is a harsh mistress, and one of the things she teaches the policy analyst is to never rely solely on state test scores. NAEP will release 2017 scores in a few weeks. Let’s see what happens next. In the meantime, the freedom for families to choose between schools and the opportunity for Arizona educators to create new schools according to their vision of excellence seems to be broadly working.

District/Charter Combos that Outperform RSD

February 19, 2018

(Guest Post by Matthew Ladner)

Just out of curiosity, I decided to look into the Reardon data to see how many Arizona district/charter combos outperformed or tied the Recovery School District in academic growth. The above list is by no means exhaustive, more the product of throwing some district names into the database before my morning caffeine. The list is not short, and includes several very poor and isolated school districts.

Mind you that no one bombed any of these districts with millions in philanthropy. You can’t go to a bar in Snowflake, Arizona and meet a group of six Teach for America teachers the way you can in the French Quarter. They also managed this somehow without a “harbor-master.” Finally, the Reardon data ends in 2015, and since then the trend in statewide data in Arizona has been positive, but in the RSD not so much. Hopefully Reardon will update his study so we can track this over time.

I want to be careful to note that I regard the Recovery School District as having been a very clever innovation for a district that had almost nothing to leverage but empty school buildings after a hurricane. If that hurricane however had leveled Houston or Dade County, I’m afraid that the limited supply of philanthropic dollars and TFA teachers would have been unequal to the (much larger) task. In order to reach scale, we are going to need solutions that do without substantial infusions of outside money, as that is likely to be in increasingly short supply.

Having said that, RSD landing in the 92nd percentile in growth in the Reardon data was truly a magnificent accomplishment. The leap however from “well done” to “everyone needs to do this now!!!!” looks very dangerous imo.


Tucson versus Columbus: Round Two

February 16, 2018

(Guest Post by Matthew Ladner)

Yesterday I presented statewide NAEP information contrasting urban schooling achievement trends in Arizona and Ohio, and specifically in Tucson and Columbus. Columbus is surrounded by suburban districts choosing not to participate in open enrollment (typical, I fear) while Tucson is surrounded by suburban districts that do participate in open enrollment- and actively so.

Today I remembered the cool data tool that the NYT developed using Sean F. Reardon’s data.

Let me start by saying that if I had to pick a district to showcase Arizona, it would not be Tucson. While I am fully aware of some outstanding schools in TUSD, the district’s reputation (fairly or not-I am no authority on the subject) usually involves enrollment decline, empty school buildings, union sway in school board elections and controversy over some sort of voluntary “La Raza” curriculum in the high schools. A decade ago you could peer into the state’s AIMS data and watch student cohorts fall further behind as they “progressed” through the system.

Arizona however has been leading the nation in academic gains, and Tucson continues to face steady and considerable competition for students not only from charter schools and private choice programs, but also from nearby suburban districts. It is my contention that this broad competition enables the bottom up accountability that results in Arizona’s average charter school closing after only 4 years despite receiving 15 year charters from the state. Reardon’s data includes both district and charter school trends, but how did Tucson fare between 2010 (3rd grade scores) and 2015 (8th grade scores) in terms of academic growth?

Tucson Unified (and charters operating within district boundaries) scored at the 64th percentile for growth during this period. Columbus Ohio meanwhile also had a charter school law active, but no suburban districts willing to allow transfers, per the Fordham map:

How did Columbus fare in the Reardon data?

Columbus scored in the 22nd percentile in academic growth during this period. The news is also grim in Cleveland, Toledo and Dayton, although Cincinnati stands out as the Ohio urban progress champion during this period. Overall, however, things look much as they do in NAEP for the two states.

Now if you want to see something really cool:

The east-west position of these columns indicates the relative wealth of each district, and Phoenix Elementary and charters sit at the tip of the gains spear.

Eden and Burke on DCPS Fraud

February 13, 2018

Behold my BROOM ye mighty and DESPAIR!

(Guest Post by Matthew Ladner)

Jayblog readers of a certain tenure may recall the case being made here that outside of the DC Opportunity Scholarship Program and DC charter schools, there was little to celebrate for disadvantaged kids attending DCPS. Over the last decade of available NAEP data, it seemed clear that advantaged students were primarily driving the overall improvement in scores, with DC charters at least showing much larger rates of improvement for disadvantaged students compared to the national average. DCPS, not so much:

Well it turns out that my view of DCPS as being largely inept outside of educating advantaged kids in carefully guarded pockets of excellence was excessively benign: DCPS also developed a systemic approach to academic fraud.

Prosecutors Eden and Burke hit the pages of National Review yesterday to bring us up to speed on the various forms of metric-driven academic fraud recently uncovered in DCPS. The district has been engaging in systemic fraud: it “improved” graduation rates by giving diplomas to huge numbers of ineligible students, and “improved” suspension rates by taking suspensions off the books. The FBI is on the case. It’s not pretty. Money quote from Eden and Burke:

When former D.C. Public Schools chancellor Michelle Rhee assumed leadership, she had a searing critique, and a clear argument: Urban schools were paralyzed by collective-bargaining agreements and inertia, so the best path forward was to have expert-designed systems for a new generation of leaders to implement. The unions, in turn, warned that administrators would weaponize these new systems to force teachers to go along with dishonest schemes that would harm true education reform in the service of posting meaningless numerical improvements.

It turns out both sides had a point.

Meet the new boss, same as the old boss.



Brookings Hamilton Project to the Rescue on Charter Rankings

February 12, 2018

(Guest Post by Matthew Ladner)

Over the weekend I thought to myself- what if we just used the Hamilton Project’s access map to rank state charter laws? The Hamilton map measures the percentage of students who have access to a charter school within their zip code. It’s not a perfect measure- some students after all have access to multiple charter schools within their zip code, and others have charters just outside it. The measure could be improved upon in theory, but let’s just run with it for a moment. What would a top 10 list look like?

  1. District of Columbia
  2. Arizona
  3. Utah
  4. Alaska
  5. Colorado
  6. New Mexico
  7. Florida
  8. Idaho
  9. Delaware
  10. Michigan

So a quick check finds only Alaska as a state with too few charter students to have made the NAEP sample in 2015. Alaska may be a bit of an anomaly because roughly half of the state’s population lives in a single metro area, meaning that a relatively small number of charter schools in a relatively small number of zip codes could cover a large percentage of the population in the Hamilton Project measure.

So the Hamilton rankings have one state that has yet to produce enough charter students to make the NAEP sample in the top 10, while the National Alliance for Public Charter Schools ranking has six (Indiana, Alabama, Mississippi, Kentucky, Maine and Washington). There is some overlap between the lists (CO, DC and FL) but generally speaking the Hamilton list looks like flourishing charter sectors, while the NAPCS list is full of charter-light charter sectors.

Sector performance is an obsession of wonks, but is of limited significance to parents, who have every incentive to concern themselves more with the fit of individual schools for their child. Nevertheless, if we indulge the wonkiness for a moment, the Hamilton list looks pretty good on NAEP math- most having either high scores or high growth or both. Even number 10 ranked Michigan has this to hang their hat on:

I’ll take the actual Michigan charters over the largely unicorn charter schools of Alabama, Kentucky, Mississippi and Washington any day of the week and twice on Sunday.

Ziebarth Defends the Pageant

February 9, 2018

Miss Indiana crowned as Miss America

(Guest Post by Matthew Ladner)

Todd Ziebarth from the National Alliance for Public Charter Schools has responded to criticism from yours truly, Max Eden and others regarding the soundness of judging charter school laws based on adherence to a model bill, rather than by their results. I encourage you to read Todd’s response.

Ziebarth in essence claims that facts on the ground in the states with the last five laws passed, rather than flaws in the laws themselves, have dampened the impact of otherwise good laws. I have no reason to doubt that differences in circumstances from state to state will influence speed out of the gate. I however do not share Ziebarth’s preference for ranking charter laws by their adherence to a model bill when it is possible to judge them by their results, as the Brookings Institution did in this map:

This map measures the percentage of students by state who have access to a charter school in their zip code. It’s not a perfect measure- after all, some zip codes have multiple charter schools. Perhaps the measure could be improved upon. When however you see states with near-zero percentages on this map near the top of a ranking list, something seems out of sorts with the rankings. Yes, circumstances can influence how well you come out of the gate, but five new laws in a row failing to produce many schools isn’t a fluke; it looks more like a pattern.

Ziebarth notes that if we don’t include the recent charter bills that have yet to produce many charters, then you get a list like the one below (each state listed along with its percentage of charter students). This revised list however remains problematic.

1 Indiana 4%
2 Colorado 13%
3 Minnesota 6%
4 District of Columbia 46%
5 Florida 10%
6 Nevada 8%
7 Louisiana 11%
8 Massachusetts 4%
9 New York 5%
10 Arizona 17%

Ok, so the top-rated law (Indiana) only produced charter schools within the zip codes of 19.5% of Indiana students, and enrolls 4% of the student population. The law has been in operation for a long time, but you as yet cannot even get a NAEP score for their schools because of the wee-tiny size of the population. If one is a utilitarian sort, any set of criteria that ranks Indiana as having the top charter school law seems in need of revision.

Minnesota has the oldest of all charter school laws, but it enrolls only six percent of the kids, and just 37.7% of kids have access to a charter in their zip code- for a law that passed in 1991. There is a word for that: contained. Minnesota gets a ton of credit for inventing charter schools, but their law doesn’t seem to be doing a whole lot to provide families with opportunities, or to produce competitive pressure to shake things up.

DC meanwhile enrolls 46% of kids in charters, and 87% of kids have access to a charter school in their zip code. It’s also easy to find evidence of academic success for DC charters. Judging by results, this certainly looks like a much better charter law than Indiana’s or Minnesota’s. Ironically, the main reason NAPCS dings the DC charter law in its scoring metric is a lack of equitable funding. DC charters however seem to be funded at a high enough level to capture 46% of the market, to provide access to 87% of kids, and to produce better results than DCPS. They also receive more generous funding per pupil than most (all?) states. There is no contest between DC and either Indiana or Minnesota in terms of outcomes in my book.

Ok, I could go on but I think the horse is dead. We’ve reached the point where it is possible to judge charter sectors by outcomes, rather than by model-bill beauty pageant criteria.

In the end charter school laws either produce seats or they don’t. Laws that fail to produce seats are failures. Laws that produce only a few seats are disappointments. Philanthropists should carefully reexamine their grant metrics to guard against the possibility that they have created a powerful incentive for groups to seek the passage of charter laws regardless of whether they ever produce many charter seats. I haven’t seen grant agreements, but I have watched as the last five laws failed to produce many schools. We are supposed to be creating meaningful opportunity for kids rather than merely colored maps.