Data Secrecy Violating Data Democracy in DC Public Schools


The District of Columbia Public Schools (DCPS) is soon to vote on yet another dramatic new educational policy that, as described in an email/letter to all members of the American Federation of Teachers (AFT) by AFT President Randi Weingarten, “would make it impossible for educators, parents and the general public to judge whether some of DCPS’ core instructional strategies and policies are really helping District children succeed.”

As per Weingarten: “Over a year ago, the Washington [DC] Teachers’ Union filed a Freedom of Information Act (FOIA) request to see the data from the school district’s IMPACT [teacher] evaluation system—a system that’s used for big choices, like the firing of 563 teachers in just the past four years, curriculum decisions, school closures and more [see prior posts about this as related to the IMPACT program here]. The FOIA request was filed because DCPS refused to provide the data….[data that are]…essential to understanding and addressing the DCPS policies and practices that impact” teachers and education in general.

Not only are such data crucial for building understanding, as noted, but they are also crucial to a functioning democracy: they allow members of the public with a stake in a public institution to test the mandates and policies they collectively support, in theory or concept (perhaps), but certainly via their taxes.

Regardless, soon after the DC union filed the FOIA, DCPS (retaliated, perhaps, and) began looking to override FOIA laws through “a radical new secrecy provision to hide the information that’s being used to make big decisions” like those associated with the aforementioned IMPACT teacher evaluation system.

Sound familiar? See prior posts about other extreme governmental moves in the name of secrecy, or rather educational policies at all costs, namely in New Mexico here and here.

You can send a letter urging those in D.C. to vote NO on the “Educator Evaluation Data Protection” provision by clicking here.

As per another post on this topic on GFBrandenburg’s Blog — which is, per its tagline, “Just a blog by a guy who’s a retired math teacher” — Brandenburg did make public some of the data now deemed “secret.” Namely, he “was leaked,” by an undisclosed source, “the 2009-10 IMPACT sub-scores from the Value-Added Monstrosity (VAM) nonsense and the Teaching and Learning Framework (TLF), with the names removed. [He] plotted the two [sets of] scores and showed that the correlation was very, very low, in fact about 0.13 [r-squared=0.33], or nearly random, as you [can] see here:”

[Figure: scatter plot of VAM scores vs. TLF scores, DCPS, 2009-10]

In the world of correlation, this is atrocious IF high-stakes consequences (e.g., teacher termination, tenure, merit pay) are to be attached to such output. No wonder DCPS does not want people checking to see whether what it is selling is as advertised.
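For readers who want to run this kind of check themselves, if and when such data are released, here is a minimal sketch. The file name and column names (vam_score, tlf_score) are hypothetical placeholders, not the actual DCPS data layout.

```python
# Minimal sketch: correlate VAM sub-scores with TLF observation scores.
# The CSV name and column names are hypothetical; assume one row per teacher,
# with names already removed.
import pandas as pd
from scipy.stats import pearsonr, spearmanr

scores = pd.read_csv("dcps_2009_10_scores.csv")  # hypothetical file name

r, p = pearsonr(scores["vam_score"], scores["tlf_score"])
print(f"Pearson r = {r:.2f}")
print(f"r-squared = {r**2:.2f}  (share of variance one score 'explains' in the other)")

# Because the published graph plots ranks, a rank correlation is the closer analogue.
rho, p_rank = spearmanr(scores["vam_score"], scores["tlf_score"])
print(f"Spearman rho = {rho:.2f}")
```

Correlations in that low range mean the two measures largely disagree about which teachers score well, which is the point the scatter plot makes visually.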

In Brandenburg’s words: “Value-Added scores for any given teacher jumped around like crazy from year to year. For all practical purposes, there is no reliability or consistency to VAM whatsoever. Not even for elementary teachers who teach both English and math to the same group of children and are ‘awarded’ a VAM score in both subjects. Nor for teachers who taught, say, both 7th and 8th grade students in, say, math, and were ‘awarded’ VAM scores for both grade levels: it’s as if someone was to throw darts at a large chart, blindfolded, and wherever the dart lands, that’s your score.”
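The year-to-year instability Brandenburg describes can be checked the same way: pair each teacher’s score in one year with that same teacher’s score the next year and correlate. A minimal sketch, again assuming a hypothetical file layout (teacher_id, year, vam_score) rather than the actual DCPS files:

```python
# Minimal sketch: year-to-year stability of teacher VAM scores.
# Hypothetical long-format CSV: one row per teacher per year,
# with columns "teacher_id", "year", and "vam_score".
import pandas as pd

scores = pd.read_csv("vam_by_year.csv")  # hypothetical file name
wide = scores.pivot(index="teacher_id", columns="year", values="vam_score")

# Keep teachers who have scores in both years (year labels assumed here).
pair = wide[[2009, 2010]].dropna()
r = pair[2009].corr(pair[2010])
print(f"Year-to-year Pearson r = {r:.2f}")
# A value near zero is what "jumped around like crazy from year to year"
# looks like numerically.
```

The same pairing works for the other comparisons Brandenburg mentions, e.g., a teacher’s English VAM score against that same teacher’s math VAM score in the same year.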

9 thoughts on “Data Secrecy Violating Data Democracy in DC Public Schools”

  1. Ms. Amrein-Beardsley, I’m glad to see you are such a big supporter of FOIA and transparent data. Maybe you will sign on to my case against VDOE to have them release the SGP data to the public.

    Actually, I did receive some SGP data with the teachers’ names removed. Contrary to what you claim the DC data showed, it was highly reliable. A teacher whose median SGP score was in the bottom 20% in year 1 was:

    – 10x more likely to remain in the bottom 20% in year 2 than to move into the top 20%

    – more likely to remain in the bottom 20% in year 2 than to move into any of the other four quintiles

    – a teacher in the top 20% had the same tendency to remain in the top 20%

    I agree that we should publish the reliability of the data. Won’t you advocate for the release of all of the data so the public can decide?

    • And what will you do, SGP, if the DC data doesn’t show what you want it to show? All this time you’ve been claiming that VAM is stable overall based on what – YOUR analysis of one county’s VAM scores?

      Maybe you and this author should analyze each other’s data from the other’s jurisdictions. That would make for an interesting post.

    • Reply to Virg: How about if you send “your” data over to Amrein-Beardsley so she can analyze it independently?

      • We actually have a study forthcoming with SGPs from throughout AZ — multiple districts, multiple years, multiple teachers. Will post on that when permitted. In the meantime, I’m happy to support any free/open data initiatives. I’m also open to analyzing any data, assuming the sample is large enough, comprehensive, legit (e.g., in terms of decision-rules), etc.
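The quintile-retention claims above (a bottom-20% teacher being far likelier to stay in the bottom 20% than to jump to the top 20%) can be made concrete with a simple transition matrix, whichever way the data point. A minimal sketch follows, assuming a hypothetical file of per-teacher scores for two consecutive years; it is not VDOE’s, DCPS’s, or anyone’s actual analysis.

```python
# Minimal sketch: quintile transitions across two years of teacher scores.
# Hypothetical long-format CSV: one row per teacher per year,
# with columns "teacher_id", "year", and "score" (VAM or median SGP).
import pandas as pd

scores = pd.read_csv("teacher_scores_by_year.csv")  # hypothetical file name
wide = scores.pivot(index="teacher_id", columns="year", values="score")
pair = wide[[2013, 2014]].dropna()  # assumed year labels; any two consecutive years

# Quintile 1 = bottom 20%, quintile 5 = top 20%, assigned within each year.
q_year1 = pd.qcut(pair[2013], 5, labels=[1, 2, 3, 4, 5])
q_year2 = pd.qcut(pair[2014], 5, labels=[1, 2, 3, 4, 5])

# Full transition table: rows are year-1 quintiles, columns are year-2 quintiles.
print(pd.crosstab(q_year1, q_year2, normalize="index").round(2))

# The commenter's specific comparison, for teachers starting in the bottom quintile.
bottom = q_year1 == 1
stay_bottom = (q_year2[bottom] == 1).mean()
jump_to_top = (q_year2[bottom] == 5).mean()
print(f"P(stay in bottom 20%) = {stay_bottom:.2f}")
print(f"P(jump to top 20%)    = {jump_to_top:.2f}")
```

The ratio of those two probabilities is the “10x” figure cited above; the same table also answers the year-to-year stability question Brandenburg raises in the post.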

  2. Has SGP actually looked at the NYC data that was published by various NYC newspapers? I have. It’s linked on my blog — you can find it by putting the terms NYC VAM in the search field.
    I looked at the data in many ways and found a striking LACK of consistency in those value-added scores for any one teacher, from year to year, from subject to subject (say, for a grade 5 teacher scored on both reading and math), or from grade to grade (if they got scored, say, on both 7th and 8th grade math). My data is open for analysis. If I’m wrong, please show me where I erred, and stop just saying Raj Chetty showed such and such with data we aren’t allowed to see.

  3. By the way, Audrey, I think R and R-squared got switched. R is about 0.36 and R-squared is about 0.13. This is the correlation between classroom observation scores on an allegedly scientific rubric and value-added scores.
    It’s onvert similar to other published results.
    Would SGP care to refute those?
    Those results, by the way, cast tremendous doubt on both value-added messes and those observational rubrics.
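On the arithmetic behind that correction: the coefficient of determination is just the correlation squared, so r ≈ 0.36 gives r² = 0.36² ≈ 0.13, whereas r ≈ 0.13 would give an r² of under 0.02. Either way, the observation scores account for only a small share of the variance in the value-added scores.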

  4. They are probably hiding more data than VAM. The graph for this post plots RANKS on VAM and RANKS on the DCPS Teaching and Learning Framework (TLF), a 41-page protocol for evaluating teachers. The TEACH part of this protocol calls for judgments at four levels of performance on nine teaching standards with 42 detailed criteria, based on classroom observations by an evaluator. In effect, the evaluator is asked to make 168 specific judgments for each evaluative session. A teacher who fails to comply with the 42 detailed criteria is placed into one of five reductive categories ranging from highly effective to ineffective.
    In addition to not having a clear system for weighting all of these observation-based criteria, the overall evaluation scheme calls for different frequencies of evaluation, from 2 to 8 per year, depending on a teacher’s career classification. There are five career classifications, and these are not stable; they are contingent on performance measures from the prior two years. So the graph offers data points all over the place, with two variables not fully described here, drawn from a complex evaluation system with multiple criteria, no clear description of how the 42 criteria are weighted other than by descriptive rubrics (a system known to be unreliable), and no measure of validity other than test scores churned through VAM.
    The DC observation protocol requires teachers to engage in practices long associated with direct instruction: mastery of easy-to-define bits and chunks of information, with students demonstrating MASTERY by being able to state the “learning objective(s)” for the lesson and what they learned. I could not find any references for the reliability and validity of the components of this evaluation scheme. All I found was a list of documents consulted in developing the criteria in the Teaching and Learning Framework.
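To make the weighting concern above concrete, here is a purely hypothetical sketch of rubric aggregation; it is not DCPS’s actual procedure, and the weights in it are invented. It only illustrates that the same 42 criterion-level judgments can produce different overall scores depending on a weighting choice the protocol reportedly does not spell out.

```python
# Purely hypothetical sketch of rubric aggregation -- NOT the actual DCPS/TLF procedure.
# 42 criteria, each judged at one of four performance levels per observation session.
import numpy as np

rng = np.random.default_rng(0)
n_criteria = 42
levels = [1, 2, 3, 4]

# One simulated observation session: a level assigned to each criterion.
judgments = rng.choice(levels, size=n_criteria)

# Equal-weight aggregation.
equal_weight_score = judgments.mean()

# An invented alternative weighting that happens to emphasize the earlier criteria.
weights = np.linspace(2.0, 0.5, n_criteria)
weights = weights / weights.sum()
weighted_score = float(judgments @ weights)

print(f"Equal-weight score:       {equal_weight_score:.2f}")
print(f"Alternative-weight score: {weighted_score:.2f}")
# When cut scores sit between rating bands, even a modest gap like this can change
# a teacher's category, which is why an unspecified weighting scheme matters.
```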
