VAMs at the Value-Added Research Center (VARC)

Following up on our last post featuring Professor Haertel’s analysis of the “Oak Tree” video, produced and disseminated by the Value-Added Research Center (VARC), affiliated with the Wisconsin Center for Education Research at the University of Wisconsin-Madison, I thought I would write, as also requested by the same VAMboozled! reader, a bit more about VARC and what I know about this organization and its VAM.

Dr. Robert H. Meyer founded VARC in 2004 and currently serves as VARC’s Research Director. Accordingly, VARC’s value-added model is also known as Meyer’s model, just as the EVAAS® is also known as Sanders’s model.

Like with the EVAAS®, VARC has a mission to perform ground-breaking work on value-added systems, as well as to conduct value-added research to evaluate the effectiveness of teachers (and schools/districts) and educational programs and policies. Unlike with the EVAAS®, however, VARC describes its methods as transparent. In reality, there is actually more information about the inner workings of the EVAAS® model, on the SAS website and via other publications, than there is about the VARC model and its methods. This is likely due to the relative youth of the VARC model, as VARC is currently in year three of model development and implementation (VARC, 2012c).

Nonetheless, VARC has a “research-based philosophy,” and VARC officials have stated that one of their missions is to publish VARC work in peer-reviewed, academic journals (Meyer, 2012). VARC has ostensibly made publishing in externally reviewed journals a priority, possibly because of the academics within VARC, as well as its affiliation with the University of Wisconsin-Madison. However, very few studies have been published to date about the model and its effectiveness, again likely given its infancy. Instead (like with the EVAAS®), the Center has disproportionately produced and disseminated technical reports, white papers, and presentations, all of which (like with the EVAAS®) seem to also be disseminated for marketing and other informational purposes, including the securing of additional contracts. Unfortunately, a commonality across the two models is that both seem bent on implementation before validation.

Regardless, VARC defines its methods as “collaborative,” given that VARC researchers have worked with school districts, mainly in Milwaukee and Madison, to help them better build and better situate their value-added models within the realities of districts and schools (VARC, 2012c). As well, VARC defines its value-added model as “fair,” though what this means remains unclear. Otherwise, little is still known about the VARC model itself, including its strengths and weaknesses.

But I would bet some serious cash that the model has the same or similar issues as all other VAMs. To review these issues, please click here to (re)read the very first post on VAMboozled! (October 30, 2013), which covers these general but major issues.

Otherwise, here are some additional specifics:

  • The VARC model uses generally accepted research methods (e.g., hierarchical linear modeling) to purportedly measure and evaluate the contributions that teachers (and schools/districts) make to student learning and achievement over time.
  • VARC compares individual students to students who are like them by adjusting the statistical models using student background factors. Unlike the EVAAS®, that is, VARC does make adjustments for student background variables that are outside of a teacher’s (or school’s/district’s) direct control.
  • VARC’s controls include up to approximately 30 variables, including the standard ones: race, gender, ethnicity, level of poverty, English language proficiency, and special education status. VARC also uses other variables when available, including, for example, student attendance, suspension, and retention records. For this and other reasons, and according to Meyer, this helps to make the VARC model “arguably one of the best in the country in terms of attention to detail.”
  • Then (like with the EVAAS®), whether students’ growth scores, aggregated at the teacher (or school/district) level, statistically exceed, meet, or fall below their growth projections (i.e., land above or below one standard deviation from the mean) helps to determine teachers’ (or schools’/districts’) value-added scores and subsequent rankings and categorizations. Again, these are relatively determined, depending on where other teachers (or schools/districts) ultimately land, and they rest on the same assumption that effectiveness is defined relative to the average of the teacher (or school/district) population.
  • Like with the EVAAS®, VARC also does this work with publicly subsidized monies, although, in contrast to SAS®, VARC is a non-profit organization.
  • By my best estimates, VARC is currently operating 25 projects with combined funding exceeding $28 million (i.e., $28,607,000), drawn from federal (e.g., the U.S. Department of Education, Institute of Education Sciences, National Science Foundation), private (e.g., Battelle for Kids, The Joyce Foundation, The Walton Foundation), and state and district sources.
  • VARC is currently contracting with the state departments of education in Minnesota, New York, North Dakota, South Dakota, and Wisconsin. VARC is also contracting with large school districts in Atlanta, Chicago, Dallas, Fort Lauderdale, Los Angeles, Madison, Milwaukee, Minneapolis, New York City, Tampa/St. Petersburg, and Tulsa.
  • Funding for the 25 projects currently in operation ranges from a $30,000 short-term, small-scale project to a $4.2 million longer-term, larger-scale project.
  • Across the grants that have been funded, regardless of type, the VARC projects currently in operation receive an average of $335,000 per year, with an average total funding level of just under $1.4 million per grant.
  • It is also evident that VARC is expanding its business rapidly across the nation. In 2004, when the center was first established, VARC was working with fewer than 100,000 students across the country. By 2010 this number had increased 16-fold; VARC was then working with data from approximately 1.6 million students in total.
  • VARC delivers sales pitches in ways similar to SAS®’s for the EVAAS®, although those affiliated with VARC do not seem to overstate their advertising claims quite like those affiliated with the EVAAS®.
  • Additionally, VARC officials are greatly focused on the use of value-added estimates for data-informed decision-making. As they put it: “All teachers should [emphasis added] be able to deeply understand and discuss the impact of changes in practice and curriculum for themselves and their students.”
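To make the relative-categorization step described above concrete, here is a toy sketch in Python of how aggregated growth scores might be labeled as exceeding, meeting, or falling below expectations relative to one standard deviation from the mean. This is my illustration with invented numbers, not VARC’s (or SAS®’s) actual implementation, which remains largely undisclosed:

```python
# Toy illustration (NOT VARC's actual model): classify each aggregated
# growth score relative to a +/- 1 standard deviation band around the
# population mean, as described in the bullets above. Scores invented.
from statistics import mean, stdev

def categorize(scores):
    """Label each teacher-level growth score as 'exceeds', 'meets',
    or 'falls below' relative to +/- 1 SD of the group mean."""
    mu, sigma = mean(scores), stdev(scores)
    labels = []
    for s in scores:
        if s > mu + sigma:
            labels.append("exceeds")
        elif s < mu - sigma:
            labels.append("falls below")
        else:
            labels.append("meets")
    return labels

# Hypothetical aggregated growth scores for six teachers
print(categorize([0.9, 1.1, 1.0, 1.4, 0.6, 1.0]))
```

Note what the sketch makes plain: the labels are norm-referenced, so by construction some teachers must land at the bottom no matter how effective the group is as a whole.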
