Is this Thing On? Amplifying the Call to Stop the Use of Test Data for Educator Evaluations (At Least for Now)


I invited a colleague of mine and now member of the VAMboozled! team – Kimberly Kappler Hewitt (Assistant Professor, University of North Carolina, Greensboro) – to write another guest post for you all (see her first post here). She wrote another, this time capturing what three leading professional organizations have to say on the use of VAMs and tests in general for purposes of teacher accountability. Here’s what she wrote:

Within the last year, three influential organizations—reflecting researchers, practitioners, and philanthropic sectors—have called for a moratorium on the current use of student test score data for educator evaluations, including the use of value-added models (VAMs).

In April of 2014, the American Statistical Association (ASA) released a position statement that was highly skeptical of the use of VAMs for educator evaluation. ASA declared that “Attaching too much importance to a single item of quantitative information is counterproductive—in fact, it can be detrimental to the goal of improving quality.” To be clear, the ASA stopped short of outright condemning the use of VAMs for educator evaluation, declaring that its statement was designed to provide guidance, not prescription. Instead, the ASA outlined the possibilities and limitations of VAMs and called into question how they are currently being (mis)used for educator evaluation.

In June of 2014, the Gates Foundation, the largest American philanthropic funder of education, released “A Letter to Our Partners: Let’s Give Students and Teachers Time.” The letter was written by Vicki Phillips, Director of Education, College Ready, who (on behalf of the Foundation) called for a two-year moratorium on the use of test scores for educator evaluation. She explained that “teachers need time to develop lessons, receive more training, get used to the new tests, and offer their feedback.”

Similarly, the Association for Supervision and Curriculum Development (ASCD), arguably the leading international educator organization with 125,000 members in more than 130 nations, recently released a policy brief calling for a two-year moratorium on high-stakes use of state tests, including their use for educator evaluations. ASCD also explicitly acknowledged that “reliance on high-stakes standardized tests to evaluate students, educators, or schools is antithetical to a whole child education. It is also counter to what constitutes good educational practice.”

While the call to halt the current use of test scores for educator evaluation is echoed across all three of these organizations, there are important nuances to their messages. The Gates Foundation, for example, makes it clear that the foundation supports the use of student test data for educator evaluation even as it declares the need for a two-year moratorium, the purpose of which is to allow teachers the time to adjust to the new Common Core Standards and related tests:

The Gates Foundation is an ardent supporter of fair teacher feedback and evaluation systems that include measures of student gains. We don’t believe student assessments should ever be the sole measure of teaching performance, but evidence of a teacher’s impact on student learning should be part of a balanced evaluation that helps all teachers learn and improve.

The Gates Foundation cautions, though, against the risk of moving too quickly to tie test scores to teacher evaluation:

Applying assessment scores to evaluations before these pieces are developed would be like measuring the speed of a runner based on her time—without knowing how far she ran, what obstacles were in the way, or whether the stopwatch worked!

I wonder what the stopwatch symbolizes in the simile: Does the Gates Foundation have questions about the measurement mechanism itself (VAM or another student growth measure), or is Gates simply arguing for more time in order for educators to be “ready” for the race they are expected to run?

While the Gates Foundation’s call for a moratorium is oriented toward realizing the positive potential of policies that use student test data for educator evaluation, by giving educators more time to prepare for them, the ASA is instead concerned about the potential negative effects of such policies. The ASA, in its attempt to provide guidance, identified problems with the current use of VAMs for educator evaluation and raised important questions about the potential effects of high-stakes use of VAMs for educator evaluation:

A decision to use VAMs for teacher evaluations might change the way the tests are viewed and lead to changes in the school environment. For example, more classroom time might be spent on test preparation and on specific content from the test at the exclusion of content that may lead to better long-term learning gains or motivation for students. Certain schools may be hard to staff if there is a perception that it is harder for teachers to achieve good VAM scores when working in them. Over-reliance on VAM scores may foster a competitive environment, discouraging collaboration and efforts to improve the educational system as a whole.

Like the ASA, ASCD is concerned with the negative effects of current accountability practices, including “over testing, a narrowing of the curriculum, and a de-emphasis of untested subjects and concepts—the arts, civics, and social and emotional skills, among many others.” While ASCD is clear that it is not calling for a moratorium on testing, it is calling for a moratorium on accountability consequences linked to state tests: “States can and should still administer standardized assessments and communicate the results and what they mean to districts, schools, and families, but without the threat of punitive sanctions that have distorted their importance.” ASCD goes further than the ASA and Gates in calling for a complete revamp of accountability practices, including policies regarding teacher accountability:

We need a pause to replace the current system with a new vision. Policymakers and the public must immediately engage in an open and transparent community decision-making process about the best ways to use test scores and to develop accountability systems that fully support a broader, more accurate definition of college, career, and citizenship readiness that ensures equity and access for all students.

So…are policymakers listening? Are these influential organizations able to amplify the voices of researchers and practitioners across the country who also want a moratorium on misguided teacher accountability practices? Let’s hope so.

2 thoughts on “Is this Thing On? Amplifying the Call to Stop the Use of Test Data for Educator Evaluations (At Least for Now)”

  1. Did “professors” seriously write this? Where to begin…

    1. The ASA did not say that VAMs are invalid. In fact, it suggested they should be used for larger populations, such as across schools and districts. And you happened to leave out this retort from Profs. Chetty, Friedman, and Rockoff: http://obs.rc.fas.harvard.edu/chetty/ASA_discussion.pdf

    2. The Gates Foundation did NOT advocate for a moratorium on VAMs because of any questions about the methodology. They simply suggested that in order to calibrate the models, a few years of data need to be collected on the CC tests (PARCC, SBAC). Your language is deceptive and does not represent their position.

    3. Part of the problem is the ineffectiveness of the teachers themselves. Ineffective teachers now conduct “test prep” since they cannot effectively convey the concepts to their students in a way that they retain the knowledge and skills. In the Gates Foundation’s MET study, the student surveys showed that the best teachers did not “teach to the test.” If we hired/retained better teachers, we could maintain an expanded curriculum and yet get better growth scores.

    4. There are attempts to confuse the public about the difference between achievement (absolute) scores and growth (change in scores). Virtually every teacher on these boards thinks that low-SES students will definitively hurt their VAM scores. In reality, a teacher could have a class full of students who did not score proficient, yet that same teacher could have the highest VAM in the school or district. You should focus more on teaching teachers what VAMs are than trying to bash them.

    5. In this study of teacher prep programs, UNC found that TFA teachers blew away EVERY other cohort. UNC did not originally plan to include TFA teachers. If we simply hired more talented candidates such as STEM majors, we could get the results we want: http://publicpolicy.unc.edu/files/2014/02/Portal_TeachPrep-TestScore_June2010_Final.pdf

    6. The way to hire better teachers is to tell the public what teachers actually earn. Districts put away about 20% of teacher pay into pension contributions (similar to 401K matches). Teachers effectively don’t need to save any of their paychecks for retirement because of these pensions. As a comparison, the military puts away 32% of basic pay since they need to serve fewer years (20 yrs). If we publish private sector pay scales to overcome the financial illiteracy of the general public, we could improve the results and not just measure them: https://drive.google.com/file/d/0B5nQmOh4yk4MYkZ2bzZKQTc4a1k/view?usp=sharing

    You seem very concerned about a few teachers getting inaccurate evaluations. Yet, you don’t care about the millions of kids sitting in the classroom of an ineffective teacher. As Prof Chetty showed in his research, you are literally taking future income from the pockets of disadvantaged kids and placing it in the paychecks of ineffective teachers. That is evil!

    • I appreciate the passion behind your post, even if I do not agree with all of your points.

      I will respond here to the points you make about my post specifically:

      1) I did not claim that the ASA stated that VAMs are invalid. I did (accurately) point out that, as reflected in its position statement, the ASA is “concerned about the potential negative effects of such policies.”
      2) I did not claim that the Gates Foundation advocated for a moratorium on VAMs based on methodological concerns. Rather, I stated (accurately) that “The Gates Foundation, for example, makes it clear that the foundation supports the use of student test data for educator evaluation even as it declares the need for a two-year moratorium, the purpose of which is to allow teachers the time to adjust to the new Common Core Standards and related tests.”

      Additionally, you claim, “You seem very concerned about a few teachers getting inaccurate evaluations.” My post made no claims or comments about teachers getting inaccurate evaluations. That said, I do think that it is imperative that educator evaluations be accurate.

      Your points #3-6 address issues well beyond the scope of my post. While I disagree with a number of points that you make, you and I do agree on one (perhaps the most important) thing: Effective educators are paramount, and every student needs and deserves an effective teacher.

      I worry that current educator evaluation policies may have detrimental effects on recruitment and retention of great teachers. My concern is based in part on a study I have conducted, funded by the Spencer Foundation, of educator perceptions from four states that use VAMs (Florida and North Carolina) or Student Growth Percentiles (New Jersey and New York) for educator evaluation. Across the survey (1000+ respondents) and interview (60+ participants) data, one of the most common concerns educators articulate is that the current use of student growth measures for educator evaluation may exacerbate challenges recruiting and retaining good educators. Additionally, when asked how the use of student growth measures has affected their own instruction, the most common response is an increase in teaching to the test. This is perhaps unsurprising, given that teachers want to ensure that their students do well on these tests since educators’ livelihoods depend—to some degree—on student performance on these tests.
