District/Charter Combos that Outperform RSD

February 19, 2018

(Guest Post by Matthew Ladner)

Just out of curiosity, I decided to look into the Reardon data to see how many Arizona district/charter combos outperformed or tied the Recovery School District in academic growth. The above list is by no means exhaustive; it is more the product of throwing some district names into the database before my morning caffeine. The list is not short, and it includes several very poor and isolated school districts.

Mind you that no one bombed any of these districts with millions in philanthropy. You can't go to a bar in Snowflake, Arizona and meet a group of six Teach for America teachers the way you can in the French Quarter. They also managed this somehow without a "harbor-master." Finally, the Reardon data ends in 2015, and since then the trend in statewide data in Arizona has been positive, but in the RSD not so much. Hopefully Reardon will update his study so we can track this over time.

I want to be careful to note that I regard the Recovery School District as having been a very clever innovation for a district that had almost nothing to leverage but empty school buildings after a hurricane. If that hurricane, however, had leveled Houston or Dade County, I'm afraid that the limited supply of philanthropic dollars and TFA teachers would have been unequal to the (much larger) task. In order to reach scale, we are going to need solutions that do without substantial infusions of outside money, as that is likely to be in increasingly short supply.

Having said that, the RSD landing in the 92nd percentile for growth in the Reardon data was truly a magnificent accomplishment. The leap, however, from "well done" to "everyone needs to do this now!!!!" looks very dangerous imo.



Tucson versus Columbus: Round Two

February 16, 2018

(Guest Post by Matthew Ladner)

Yesterday I presented statewide NAEP information contrasting urban schooling achievement trends in Arizona and Ohio, and specifically in Tucson and Columbus. Columbus is surrounded by suburban districts choosing not to participate in open enrollment (typical, I fear), while Tucson is surrounded by suburban districts that do participate in open enrollment, and actively so.

Today I remembered the cool data tool that the NYT developed using Sean F. Reardon’s data.

Let me start by saying that if I had to pick a district to showcase Arizona, it would not be Tucson. While I am fully aware of some outstanding schools in TUSD, the district's reputation (fairly or not; I am no authority on the subject) usually involves enrollment decline, empty school buildings, union sway in school board elections and controversy over some sort of voluntary "La Raza" curriculum in the high schools. A decade ago you could peer into the state's AIMS data and watch student cohorts fall further behind as they "progressed" through the system.

Arizona, however, has been leading the nation in academic gains, and Tucson continues to face steady and considerable competition for students not only from charter schools and private choice programs, but also from nearby suburban districts. It is my contention that this broad competition enables the bottom-up accountability that results in Arizona's average charter school closing after only four years despite receiving a 15-year charter from the state. Reardon's data includes both district and charter school trends, but how did Tucson fare between 2010 (3rd grade scores) and 2015 (8th grade scores) in terms of academic growth?

Tucson Unified (and charters operating within district boundaries) scored at the 64th percentile for growth during this period. Columbus, Ohio, meanwhile, also had an active charter school law, but no suburban districts willing to allow transfers, per the Fordham map:

How did Columbus fare in the Reardon data?

Columbus scored in the 22nd percentile in academic growth during this period. The news is also grim in Cleveland, Toledo and Dayton, although Cincinnati stands out as the Ohio urban progress champion during this period. Overall, however, things look much like they do in NAEP for the two states.

Now if you want to see something really cool:

The east-west position of these columns indicates the relative wealth of the district, and Phoenix Elementary and charters sit at the tip of the gains spear.


Gender Gaps in College STEM Education: Boys Tend to Be Over-Confident in Math and Benefit from It

February 15, 2018


(Guest Post by Gema Zamarro & Lina M. Anaya)

Employment in the so-called STEM (Science, Technology, Engineering and Mathematics) fields is projected to continue growing, according to the Bureau of Labor Statistics. Additionally, wages in STEM occupations are estimated to be, on average, nearly double the national average for non-STEM jobs. Despite this promising future, women continue to be under-represented in STEM. Women are less likely to enroll in STEM degrees in college and represent a smaller share of STEM occupations. The question is why? Only after understanding the possible sources of such gender gaps can we have an idea of what can be done about them.

This question has haunted me (Gema) since my daughter, then a kindergartener, came home one day saying a boy in her class told her "girls are not good in math." Indeed, researchers have pointed to gender differences in math performance and perceived math ability as possible drivers of later gender gaps in STEM. I wondered if parents could somehow counter these effects. After all, my previous work indicated that parental occupation type could be important for women's long-term STEM outcomes. In a recent working paper, I partnered with Lina M. Anaya, a Ph.D. student in the Department of Education Reform at the University of Arkansas, and Frank Stafford, an economics professor at the University of Michigan, to try to shed some light on these questions, using data from the Panel Study of Income Dynamics (PSID).

Using information from the multiple supplements of the PSID, we measured gender gaps in performance on the standardized Woodcock-Johnson Applied Problems test (W-J AP) and in self-reported perceived math ability, measured on children from PSID families when they were between 6 and 17 years old (on average around 11 years old). We were then able to track these children and study their likelihood of majoring in a STEM field in college. We found that boys are more confident in their abilities than is warranted by their performance, while girls are less confident than is warranted by theirs. But the problem isn't just one of lack of confidence: boys' confidence contributes more to their pursuit of STEM majors than girls' confidence does, even for students with the same true ability and the same level of confidence.

Our results corroborated significant gender differences in W-J AP test performance and in perceived math ability during childhood. Even after conditioning on a given level of math performance in the W-J AP test, girls reported significantly lower levels of perceived math ability than boys (see Table 1). In the highest percentiles of math performance, 64% of boys reported the highest levels of perceived math ability, as compared to 50% of girls. Even at the lowest levels of math performance, boys tended to be more optimistic about their math ability: 29% of boys reported the highest levels of perceived math ability, relative to 17% of girls. Having a parent with an occupation in STEM helped increase math performance but did not seem to improve perceived math ability; if anything, those with parents in science were more pessimistic.

Table 1: Perceived Math Ability by Gender, given W-J AP scores (% of sample)

                              Perceived Math Ability
W-J AP (percentile)  Gender   1 to 3   4 to 5   6 to 7
0-50                 Boys      15.9%    55.1%    29.0%
                     Girls     18.7%    64.2%    17.1%
51-80                Boys       4.1%    44.3%    51.6%
                     Girls      6.5%    49.9%    43.6%
81-100               Boys       2.8%    32.7%    64.4%
                     Girls      4.6%    45.2%    50.2%

Note: Weighted percentages reported using child population weights
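For readers who want to reproduce this kind of conditional tabulation, each percentage in Table 1 is a weighted share within its performance-bin-by-gender row. A minimal sketch of that computation, noting that the records, weights, and values below are invented for illustration and do not use the PSID's actual variable names:

```python
from collections import defaultdict

# Hypothetical records: (performance bin, gender, perceived-ability band, weight).
# The bins mirror Table 1, but the data and weights here are made up.
records = [
    ("0-50", "Boys", "4 to 5", 2.0),
    ("0-50", "Boys", "6 to 7", 1.0),
    ("0-50", "Girls", "1 to 3", 1.5),
    ("0-50", "Girls", "4 to 5", 3.0),
]

cell = defaultdict(float)  # total weight in each (bin, gender, band) cell
row = defaultdict(float)   # total weight in each (bin, gender) row
for perf, gender, band, w in records:
    cell[(perf, gender, band)] += w
    row[(perf, gender)] += w

# Each table entry is the cell's weighted share of its (bin, gender) row.
shares = {k: cell[k] / row[k[:2]] for k in cell}
```

With real survey data the child population weights would replace the made-up weights above, but the row-share arithmetic is the same.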

Interestingly, girls' lack of perceived ability seems to be specific to math and not the result of girls generally reporting lower levels of perceived ability. The PSID also included results on the Woodcock-Johnson reading test (W-J reading) and asked kids to report their perceived ability in reading. We used this information to study perceived ability in reading conditional on performance. As can be seen in Table 2, gender patterns are very different for reading, a subject where girls, on average, outperform boys. Here we observe smaller gender differences in perceived reading ability among those scoring in the higher percentiles of the W-J reading test, while girls performing in the lower percentiles report higher levels of perceived ability than boys.

Table 2: Perceived Reading Ability by Gender, given W-J reading scores (% of sample)

                                  Perceived Reading Ability
W-J Reading (percentile)  Gender   1 to 3   4 to 5   6 to 7
0-50                      Boys      13.4%    57.8%    28.8%
                          Girls      9.2%    50.1%    40.7%
51-80                     Boys       4.3%    47.1%    48.6%
                          Girls      3.4%    32.4%    64.1%
81-100                    Boys       1.9%    29.5%    68.6%
                          Girls      1.5%    33.5%    65.0%

Note: Weighted percentages reported using child population weights

Finally, since the PSID tracked these kids, we studied the extent to which math performance and perceived math ability during childhood, along with parental occupation type, are related to the probability of majoring in STEM in college. Overall, as expected, we find that women are less likely to major in STEM in our sample, especially when we look at the so-called "hard sciences" fields of engineering, architecture, mathematics and computer sciences. Both higher levels of math performance on the W-J AP test and higher levels of perceived math ability are related to higher probabilities of majoring in a STEM field.

But here is where it gets interesting: the effects of higher levels of math performance and perceived ability are much bigger for boys than for girls. Performing in the highest percentiles of the W-J AP distribution, as compared to the lowest, is associated with an increase in the probability of majoring in a "hard sciences" STEM field of about 13 percentage points for boys but only 6 percentage points for girls. Similarly, reporting the highest levels of perceived math ability, as compared to the lowest, is associated with an increase of about 7 percentage points for boys but only 2 percentage points for girls. These results suggest a loss of STEM enrollment by otherwise capable women. And we can't simply fix the problem by trying to boost women's confidence in their true abilities, because women's confidence contributes less to pursuing STEM than men's confidence does. Perhaps men are rewarded for over-confidence in a way that women are not.
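Gender-specific associations of this kind are typically recovered from a model with a gender interaction. Here is a sketch using a simple linear probability model on simulated data; the effect sizes are invented to echo the reported pattern (roughly 13 vs. 6 percentage points), and this is not the paper's actual specification or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
female = rng.integers(0, 2, n)
top_math = rng.integers(0, 2, n)  # 1 = highest performance percentiles (simulated)

# Built-in "true" effects: +13 pp for boys in the top percentiles,
# +6 pp for girls (i.e., a -7 pp interaction).
p = 0.05 + 0.13 * top_math - 0.07 * top_math * female
stem = (rng.random(n) < p).astype(float)

# Linear probability model: intercept, performance, female, interaction.
X = np.column_stack([np.ones(n), top_math, female, top_math * female])
beta, *_ = np.linalg.lstsq(X, stem, rcond=None)
# beta[1] recovers the boys' gain (~0.13); beta[1] + beta[3] the girls' (~0.06)
```

The interaction coefficient is what captures the claim that a given jump in performance buys boys more STEM enrollment than it buys girls.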

Interestingly, having a parent who works in a STEM occupation appears to help girls much more than boys. The probability of majoring in "hard sciences" STEM fields increases by about 14 percentage points for girls when one of the parents works in a science job; for boys the increase is only 4 percentage points. Whatever the reason, these results suggest that parental occupation type could be an important factor in reducing gender differences.

As for the answer I gave to my daughter, I said, "It is not true that girls are bad at math. Look at your mother. My job is doing math all day!" I work in the field of applied econometrics, so I guess that was close enough.


Tucson Arizona versus Columbus Ohio

February 15, 2018

(Guest Post by Matthew Ladner)

Large urban districts in Arizona are surrounded by suburban districts accepting transfers through open enrollment. I fear that Arizona is an outlier in this regard, and that the rest of the country is more like Ohio than Arizona. Fordham produced this deeply revealing open enrollment map of Ohio, showing every major urban center to be surrounded by districts that do not participate in open enrollment. Non-participating districts are shaded in dark blue:

Now you will recall a similar map of Arizona, with districts not participating in open enrollment again marked in deep blue:


I believe that open enrollment is a big reason that Arizona has been leading the nation in NAEP gains, and that charter and private choice programs deserve some credit for the eagerness with which districts participate. Take a look at Columbus on the above map: a large urban district literally surrounded by districts choosing not to allow open enrollment transfers. Now take a look at the school district map of Pima County. The Tucson Unified School District is surrounded by districts that do participate in open enrollment, and actively so.

Tucson is part of the nation's second fastest growing state, but Tucson Unified has experienced a steady decline in enrollment. This is in part due to the rise of charter schools, as documented by the Center for Student Achievement:

Open enrollment

Several of the districts in the chart above gained enrollment despite the increase in charter school enrollment: Queen Creek, Higley, Chandler and Phoenix Elementary. Notice also that these districts, which run the gamut between suburban and urban Arizona, all have growing charter school sectors.

Urban students in Arizona have the opportunity to attend suburban district schools, while their peers in Ohio (and much of the rest of the country) do not. We sadly do not yet have district-by-district data on open enrollment, but research by a Yale student found that almost a third of K-8 students in Phoenix-area districts had utilized open enrollment. We know, for instance, that Scottsdale Unified has 4,000 students attending through open enrollment. Anecdotally, we know that several of the Tucson area school districts are also very active in open enrollment.

Arizona's urban students have the opportunity to attend suburban schools, and Ohio's urban students do not. In my view this is primarily because Arizona charter schools have helped open suburban seats, while Ohio's choice programs have been overwhelmingly focused on urban students. So let's check NAEP trends for large city students for all six exams for the entire period with state level data:

I’m confident I know what is going right for Arizona’s students in large cities: opportunity. They have the opportunity to attend their home district, suburban districts, charter schools (lots of them) and private schools. Tucson did not participate in TUDA, but does show positive trends in the state’s AZMerit data. Tucson’s enrollment is declining, but scores are improving and that is without factoring in the scores of kids attending suburban district schools, charter schools or private schools with scholarship assistance.

I’m not nearly as confident that I understand what is going wrong for urban students in Ohio, but this:

…is not working for them at all.


Public Service Announcement

February 14, 2018

(Guest post by Greg Forster)

Citizens! You are hereby notified that Mary and the Witch’s Flower, which I reviewed here, is returning to US theaters for two days only next week, due to overwhelming demand.

If you missed it the first time, attendance is mandatory. Otherwise, attendance is merely meritorious.

A theater near me has brought back Darkest Hour, the surprise hit of the Oscar noms – perhaps one near you has done so as well. As I said before, you should see that one on the big screen, too!

End transmission.


Eden and Burke on DCPS Fraud

February 13, 2018

Behold my BROOM ye mighty and DESPAIR!

(Guest Post by Matthew Ladner)

Jayblog readers of a certain tenure may recall the case being made here that outside of the DC Opportunity Scholarship Program and DC charter schools, there was little to celebrate for disadvantaged kids attending DCPS. Over the last decade of available NAEP data, it seemed clear that advantaged students were primarily driving the overall improvement in scores, with DC charters at least showing much larger rates of improvement for disadvantaged students compared to the national average. DCPS, not so much:

Well it turns out that my view of DCPS as being largely inept outside of educating advantaged kids in carefully guarded pockets of excellence was excessively benign: DCPS also developed a systemic approach to academic fraud.

Prosecutors Eden and Burke hit the pages of National Review yesterday to bring us up to speed on the various forms of metric-driven academic fraud recently uncovered in DCPS. DCPS has been engaging in systemic fraud in order to "improve" its numbers: it "improved" graduation rates by giving diplomas to huge numbers of ineligible students, and "improved" suspension rates by taking suspensions off the books. The FBI is on the case. It's not pretty. Money quote from Eden and Burke:

When former D.C. Public Schools chancellor Michelle Rhee assumed leadership, she had a searing critique, and a clear argument: Urban schools were paralyzed by collective-bargaining agreements and inertia, so the best path forward was to have expert-designed systems for a new generation of leaders to implement. The unions, in turn, warned that administrators would weaponize these new systems to force teachers to go along with dishonest schemes that would harm true education reform in the service of posting meaningless numerical improvements.

It turns out both sides had a point.

Meet the new boss, same as the old boss.



Brookings Hamilton Project to the Rescue on Charter Rankings

February 12, 2018

(Guest Post by Matthew Ladner)

Over the weekend I thought to myself: what if we just used the Hamilton Project's access map to rank state charter laws? The Hamilton map measures the percentage of students who have access to a charter school within their zip code. It's not a perfect measure; some students, after all, have access to multiple charter schools within their zip codes and others nearby. The measure could be improved upon in theory, but let's just run with it for a moment. What would a top 10 list look like?

  1. District of Columbia
  2. Arizona
  3. Utah
  4. Alaska
  5. Colorado
  6. New Mexico
  7. Florida
  8. Idaho
  9. Delaware
  10. Michigan

So a quick check finds Alaska as the only state on the list with too few charter students to have made the NAEP sample in 2015. Alaska may be a bit of an anomaly: because roughly half of the state's population lives in a single city, a relatively small number of charter schools in a relatively small number of zip codes can cover a large percentage of the population in the Hamilton Project measure.

So the Hamilton rankings have one state that has yet to produce enough charter students to make the NAEP sample in the top 10, while the National Alliance for Public Charter Schools ranking has six (Indiana, Alabama, Mississippi, Kentucky, Maine and Washington). There is some overlap between the lists (CO, DC and FL) but generally speaking the Hamilton list looks like flourishing charter sectors, while the NAPCS list is full of charter-light charter sectors.

Sector performance is an obsession of wonks, but it is of limited significance to parents, who have every incentive to concern themselves more with the fit of individual schools for their child. Nevertheless, if we indulge the wonkiness for a moment, the Hamilton list looks pretty good on NAEP math, with most states having high scores, high growth, or both. Even number-10-ranked Michigan has this to hang its hat on:

I’ll take the actual Michigan charters over the largely unicorn charter schools of Alabama, Kentucky, Mississippi and Washington any day of the week and twice on Sunday.


Ziebarth Defends the Pageant

February 9, 2018

Miss Indiana crowned as Miss America

(Guest Post by Matthew Ladner)

Todd Ziebarth from the National Alliance for Public Charter Schools has responded to criticism from yours truly, Max Eden and others regarding the soundness of judging charter school laws based on adherence to a model bill, rather than by their results. I encourage you to read Todd’s response.

Ziebarth in essence claims that facts on the ground in the five most recently passed laws, rather than flaws in the laws themselves, have dampened the impact of otherwise good laws. I have no reason to doubt that differences in circumstances from state to state will influence speed out of the gate. I do not, however, share Ziebarth's preference for ranking charter laws by their adherence to a model bill when it is possible to judge them by their results, as the Brookings Institution did in this map:

This map measures the percentage of students in each state who have access to a charter school in their zip code. It's not a perfect measure; after all, some zip codes have multiple charter schools. Perhaps the measure could be improved upon. When, however, you see states with near-zero percentages on this map near the top of a ranking list, something seems out of sorts with the rankings. Yes, circumstances can influence how well you come out of the gate, but five new laws in a row failing to produce many schools isn't a fluke; it looks more like a pattern.

Ziebarth notes that if we don't include the recent charter bills that have yet to produce many charters, then you get a list like the following (each state listed along with its percentage of charter students). This revised list, however, remains problematic.

  1. Indiana (4%)
  2. Colorado (13%)
  3. Minnesota (6%)
  4. District of Columbia (46%)
  5. Florida (10%)
  6. Nevada (8%)
  7. Louisiana (11%)
  8. Massachusetts (4%)
  9. New York (5%)
  10. Arizona (17%)

Ok, so the top-rated law (Indiana) produced charter schools within the zip codes of only 19.5% of Indiana students, and those schools enroll 4% of the student population. The law has been in operation for a long time, but you cannot yet even get a NAEP score for its schools because of the wee-tiny size of the population. If one is a utilitarian sort, any set of criteria that ranks Indiana as having the top charter school law seems in need of revision.

Minnesota has the oldest of all charter school laws, but charters enroll only six percent of the kids, and only 37.7% of kids have access to a charter in their zip code, for a law that passed in 1991. There is a word for that: contained. Minnesota gets a ton of credit for inventing charter schools, but its law doesn't seem to be doing a whole lot to provide families with opportunities or to produce competitive pressure to shake things up.

DC, meanwhile, enrolls 46% of kids in charters, and 87% of kids have access to a charter school in their zip code. It's also easy to find evidence of academic success for DC charters. Judging by results, this certainly looks like a much better charter law than Indiana's or Minnesota's. Ironically, the main reason NAPCS dings the DC charter law in its scoring metric is a lack of equitable funding. DC charters, however, seem to be funded at a high enough level to capture 46% of the market, to provide access to 87% of kids, and to produce better results than DCPS. They also receive more generous funding per pupil than most (all?) states. There is no contest between DC and either Indiana or Minnesota in terms of outcomes in my book.

Ok, I could go on, but I think the horse is dead. We've reached the point where it is possible to judge charter sectors by outcomes, rather than by model-bill beauty-pageant criteria.

In the end charter school laws either produce seats or they don’t. Laws that fail to produce seats are failures. Laws that produce only a few seats are disappointments. Philanthropists should carefully reexamine their grant metrics to guard against the possibility that they have created a powerful incentive for groups to seek the passage of charter laws regardless of whether they ever produce many charter seats. I haven’t seen grant agreements, but I have watched as the last five laws failed to produce many schools. We are supposed to be creating meaningful opportunity for kids rather than merely colored maps.


The Legacy of Andrew Coulson

February 8, 2018

Screen Shot 2018-02-08 at 11.57.16 AM

(Guest Post by Jason Bedrick)

Yesterday marked the second anniversary of the tragically early passing of Andrew J. Coulson, the brilliant and (in the words of his beloved wife, Kay) “happy, effusive, relentlessly upbeat” education reformer, policy analyst, and director of the Cato Institute’s Center for Educational Freedom.

IMHO, the best tribute we can pay to Andrew is to reflect on his ideas. Although he didn’t live to see it, PBS ran his magnum opus documentary, School, Inc., about how and why our education system lacks the progress, innovation, and efficiency gains seen in nearly every other industry. Last year, the Friedmanesque three-part series won the Anthem Film Festival’s award for Excellence in Filmmaking – Documentary Feature, and now Free to Choose Media is making the documentary available to view for free online.

The Cato Institute has also made Educational Freedom: Remembering Andrew Coulson, Debating His Ideas, available to download as a free e-book.

Andrew’s voice is greatly missed in today’s debates over education policy, but as Neal McCluskey wrote, “Thankfully, his ideas remain, and they will always illuminate the pathway forward.”

 

NOTE: This post has been updated to clarify that it is Free to Choose Media that is making School, Inc. available to watch free online.


A Brief History of NAEP Cohort Math Gains: The Low-Hanging Fruit Already Picked

February 8, 2018

(Guest Post by Matthew Ladner)

The 2017 NAEP is due to be released in a few weeks, so I thought it would be a good time to review a brief history of where we've been. The above table lists all of the available math cohort gains by jurisdiction for the entire period in which all states have been giving NAEP. These cohort gains are calculated by subtracting the 4th grade scores of a cohort of students from that same cohort's 8th grade scores four years later. NAEP math and reading tests were specifically scaled and timed in such a way as to allow for such comparisons.
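As a concrete illustration of the calculation, here is a minimal sketch; the scores below are placeholders for the sake of the example, not actual NAEP results:

```python
# Cohort gain: a state's 8th-grade NAEP math score minus the same cohort's
# 4th-grade score four years earlier. Scores below are illustrative only.
naep_math = {
    # state: {(grade, year): average scale score}
    "AZ": {(4, 2011): 235, (8, 2015): 283},
    "MD": {(4, 2011): 247, (8, 2015): 281},
}

def cohort_gain(state, start_year):
    scores = naep_math[state]
    return scores[(8, start_year + 4)] - scores[(4, start_year)]
```

Because NAEP tests 4th graders and 8th graders four years apart on a common scale, the difference tracks (approximately) the same group of kids as they move through the system.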

Now…just a minute…stop staring at your state’s results and pay attention…oh okay fine go stare at your state’s results and then come back.

Right, now that you are done with that, allow me to draw your attention to the AVERAGE row at the bottom. This is a simple average across states, and it appears to be in slow but steady decline. Notice, for instance, Maryland's transformation from a reform super-hero to a state that appeared to forget to teach mathematics to kids in 6th grade. Notice that the top gains from the 2009-2013 and 2011-2015 periods (Arizona) would not have been the top gains in the golden age of 2003-2007. Arizona winds up on top in recent years because it remained consistently pretty high while other states declined.

It should be noted that factors other than the quality of instruction could be at play here. For instance, inclusion rates for students with disabilities and ELL students may have varied over time, creating the appearance of a decline. To test this, the below table runs the same math cohort gains but this time only for general education students:

Overall the story does not change a great deal: we still see a declining trend, and Maryland forgot to teach math to general education students about as much as to everyone else. I will also note that Arizona owes its status as the math gains champ for 2009-13 and 2011-15 to gains among special education and/or ELL students, which, as someone who worked on choice programs for special needs students in Arizona for a decade and a half, warms my heart:

My guess is that reformers picked the low-hanging fruit of education reform in the early aughts. The introduction of standards and testing in the early days seems to have produced a bump in achievement. Over time however this effect may be fading. Political Science 101 teaches that organized interests defeat diffuse interests 99 times out of a hundred, so the ability of states to employ a cat o’ nine tails and whip schools into improvement has limits. Dozens of decisions taken daily in the musty basements of State Departments of Education and obscure measures voted on by State Boards of Education can slowly but surely defang and/or subvert state accountability systems.

If there are two things that the organized employee interests of adults working in schools are expert at, they are passive resistance and bureaucratic infighting. In my book, much of the reform crowd chose its ground unwisely, fighting opponents on terrain where it has little chance to prevail. Things fall apart; the center cannot hold.

Mike Petrilli recently, and correctly imo, noted that the 2017 NAEP would be a pretty definitive test of the efficacy of the Obama-era projects: promoting Common Core, teacher evaluation, and student discipline reform. Top-down directives have a funny way of not working out, or even backfiring. Let's see what happens next.