Audrey Amrein-Beardsley, Ph.D.
Lead Blogger and Creator of VAMboozled!

Audrey Amrein-Beardsley, a former middle- and high-school mathematics teacher, received her Ph.D. in 2002 from Arizona State University in the Division of Educational Leadership and Policy Studies with an emphasis on research methods. Awarded tenure in 2010 as an ASU Presidential Exemplar, she is currently a Professor in the Mary Lou Fulton Teachers College. From 2011 to 2014 she was also named one of the top edu-scholars in the nation, honored as a university-based academic contributing most substantially to public debates about the educational system. Audrey’s research interests include educational policy, research methods, and, more specifically, high-stakes tests and value-added measurements and systems. In addition, she researches aspects of teacher quality, teacher evaluation, and teacher education. She is the creator of the blog VAMboozled!, in which she and a team of colleagues write about teacher evaluation, teacher accountability, and value-added model (VAM) related issues across the nation. She is also the creator and host of an online biographical show titled Inside the Academy, in which she interviews some of the top educational researchers in the academy and archives their personal and professional histories online for educational audiences for years to come. Audrey’s latest book, Rethinking Value-Added Models in Education: Critical Perspectives on Tests and Assessment-Based Accountability, takes a critical look at VAMs and offers research-based concerns and recommendations regarding their use for teacher evaluation purposes.

Audrey Amrein-Beardsley’s Curriculum Vitae
Email Audrey

Tom Haladyna, Ph.D.
Guest Blogger – Tests and Assessment

Tom Haladyna has been a life-long educator. During his career, he has been an elementary school teacher, university professor, research professor, and test director. His doctoral studies were concentrated in statistics, measurement, and research methods. He was a visiting scholar at the U.S. Navy Personnel Research and Development Center and a National Assessment Scholar at the Educational Testing Service. Tom has authored, co-authored, or edited 14 books, more than 60 journal articles, and more than 200 articles, reports, white papers, opinions, and technical papers on validity, test development, and item development. He is co-editor, with Steve Downing, of the Handbook of Test Development (2006), which has been translated into Spanish and Japanese to reach a larger audience. He is also co-editor of the second edition of the Handbook, scheduled for publication in 2014. Tom has consulted for more than 100 organizations, including the American Dental Association, Microsoft, the Certified Financial Analyst Institute, the Association of Social Work Boards, the American Compensation Association, the Arizona Supreme Court, the Educational Testing Service, the Motorola Corporation, the National Association of State Boards of Accountancy, the Oregon Department of Mental Health, the U.S. Army, and the Colorado Peace Officers Standards and Training Board. He specializes in item and test development and in validation and validity studies.

Tom Haladyna’s Curriculum Vitae
Email Tom 

Jessica Holloway-Libell, Ph.D. 
Guest Blogger and Blog Manager

Jessica Holloway-Libell recently earned her Ph.D. in Education Policy and Evaluation at Arizona State University. She has twelve years’ experience in the field of education, both in practice and in research. She currently teaches Structured English Immersion courses for pre-service teachers and conducts research on teacher evaluation processes and methods. She also contributes blog entries to, and manages, the blog VAMboozled! Jessica’s research interests are in teacher evaluation systems, specifically those that rely in significant part on student growth data as measured by value-added models. Her dissertation looks at the ways in which teachers and their evaluators make sense of and experience teacher evaluation processes under a comprehensive evaluation system (i.e., TAP: The System for Teacher and Student Advancement). She is specifically interested in how teachers and teacher quality have been discursively constructed and constituted by market-based discourses.

Jessica Holloway-Libell’s Curriculum Vitae
Email Jessica

Kimberly Kappler Hewitt, Ph.D.

Guest Blogger — Student Growth Measures

Kimberly Kappler Hewitt served as a middle and high school teacher in Atlanta, Georgia, before becoming an elementary principal and then district administrator in Ohio. She earned her Ph.D. in educational leadership from Miami University in 2009. She currently serves as an assistant professor of educational leadership at the University of North Carolina Greensboro. Her research focuses on the ethical and efficacious use of educational data, from instructional leadership and policy standpoints. She is the editor of Postcards from the Schoolhouse: Practitioner Scholars Examine Contemporary Issues in Educational Leadership (NCPEA Press, 2011) and author of Differentiation is an Expectation: A School Leader’s Guide to Building a Culture of Differentiation (Routledge, 2011).

Email Kimberly

Noelle A. Paufler, Ph.D.
Guest Blogger — Educational Policy

Dr. Noelle A. Paufler earned her Ph.D. in Educational Policy and Evaluation from the Mary Lou Fulton Teachers College at Arizona State University. Her research interests include K-12 educational policy, specifically how educational leaders enact policy into practice and its impact on teachers and students. Her most recent research examines the lived experiences of principals and teachers and the impact and (un)intended consequences of accountability policy reified in teacher evaluation systems in public schools. She also serves as a researcher and website manager for Inside the Academy, an online educational historiography hosted by Dr. Audrey Amrein-Beardsley that honors the distinguished personal and professional contributions of exemplary educational researchers and scholars to the field of education and the academy.


Margarita Pivovarova, Ph.D.
Guest Blogger – Economics

Margarita Pivovarova earned her Ph.D. in Economics from the University of Toronto in 2013 where she focused on Economics of Education. She joined the Mary Lou Fulton Teachers College at Arizona State University as an Assistant Professor in Education Policy and Evaluation in the summer of 2013. Margarita’s research interests include communicating economists’ views on current needs, initiatives, policies, and reforms in the education domain, as well as bridging together education and economics to advance educational research using econometric methods. She also studies knowledge spillovers among elementary school students, gender and ability peer effects in education, and how the existence of peer effects can inform classroom design for the betterment of student learning and achievement. Currently, she is investigating the use of value-added models (VAMs) to inform different models of teacher effectiveness, as well as examining how different incentives for teachers in schools change teachers’ behaviors and impact student achievement.

13 thoughts on “The Team”

  1. We are confused… is the PARCC a normed test or a criterion-referenced test? We’ve heard different things. Thank you! Michelle

    In answer to your query, norm-referenced and criterion-referenced are types of test score interpretations. Strictly speaking, no test is norm-referenced or criterion-referenced. A norm-referenced interpretation allows us to compare student scores. A criterion-referenced interpretation allows us to assess how well a student is doing in a domain of interest (such as reading, writing, or mathematics). Sometimes, we set a cut score to help us determine how well a student is doing. 75% is a cut score, and a student might be well above or well below that cut score. The common core tests are very comparable to previous national tests. These tests provide both norm- and criterion-referenced interpretations, because that’s how they were designed. All these new tests are professionally developed following our national test standards and principles governing validity. Upon review, you will find these tests to be of very high quality.

    Tom Haladyna
    Professor Emeritus
    Arizona State University.
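
    The distinction drawn above can be sketched in a few lines of code. The scores, the 75% cut, and both helper functions are hypothetical illustrations, not part of any actual test's scoring procedure; they only show how the same raw score supports two different interpretations.

    ```python
    # One raw score, two interpretations: norm-referenced (relative to peers)
    # and criterion-referenced (relative to a fixed cut score).

    def percentile_rank(score, all_scores):
        """Norm-referenced: where does this score fall relative to the group?"""
        below = sum(1 for s in all_scores if s < score)
        return 100.0 * below / len(all_scores)

    def meets_cut(score, cut=75.0):
        """Criterion-referenced: does the score clear the fixed cut score?"""
        return score >= cut

    scores = [62, 70, 75, 81, 88, 93]  # hypothetical percent-correct results
    student = 81
    print(percentile_rank(student, scores))  # 50.0 -> above half the group
    print(meets_cut(student))                # True -> clears the 75% cut
    ```

    Note that neither function says anything about the other's question: a student can clear the cut while ranking low in a strong group, and vice versa.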

  2. Has there ever been any examination of the effect that chronic absenteeism has on VAM scores? In schools with high absenteeism, where a teacher rarely or might never have the same group of students in a specific class each time that class meets, does any adjustment VAM attempts to make for that situation come close to functioning? Is the adjustment an on/off switch or a range? At what level does the absence of the same students, or a varying group of students, invalidate VAM before any other effect? It seems to me that there must be some level of absenteeism that invalidates any VAM score before any other consideration comes into play. I’ve looked for information on this a couple of times over the years and have always come up empty.
    PS. apologies for the multiple channels I used to get this question to you folks.

  3. I teach in a state that just released their first VAM rankings through EVAAS. In calculating the scores, we were asked to change the percentage of time spent in the classroom for students only if they missed 9 or more days. My school operates on a 90-minute, two-semester (90 days each) schedule. So missing 9 days would be like missing 18 at the elementary level. The number of days we were allowed to claim was not adjusted based on that different schedule. We were not allowed to claim less responsibility for students who were absent 8 or fewer days either.
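
    The block-schedule arithmetic in this comment can be sketched as follows. The weighting function is a hypothetical illustration of a fixed-threshold attribution rule, not EVAAS's actual formula; only the 9-day threshold and the 90-day semester come from the comment.

    ```python
    # Hypothetical sketch of a fixed absence threshold applied to two
    # different schedules. NOT EVAAS's actual model: it only shows why
    # the same 9-day cutoff represents different shares of instruction.

    def share_of_instruction_missed(days_absent, days_in_term):
        """Fraction of the term's instructional days a student missed."""
        return days_absent / days_in_term

    def adjusted_weight(days_absent, days_in_term, threshold=9):
        """Reduce claimed responsibility only at or past the threshold."""
        if days_absent < threshold:
            return 1.0  # full responsibility, per the rule described
        return 1.0 - share_of_instruction_missed(days_absent, days_in_term)

    # 9 absences on a 90-day block semester vs. a 180-day traditional year:
    print(share_of_instruction_missed(9, 90))   # 0.1  -> 10% of instruction
    print(share_of_instruction_missed(9, 180))  # 0.05 -> 5% of instruction
    ```

    The point the commenter raises falls out directly: an unadjusted threshold makes 9 absences count twice as heavily against a block-schedule teacher as against one on a traditional calendar.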

    • That’s what we call “objective” science that yields an “objective” truth, unfortunately. The arbitrariness that goes into these systems, such as you describe here, is the arbitrariness also at issue in some of the lawsuits currently underway. Hopefully judges like the one in New Mexico will continue to recognize this in our favor.

  4. My school, a private religious high school, is considering the Danielson Framework for teacher observations. However, after reading the book and doing some research, I am troubled that it is not what it seems. It claims to be research based, yet is founded on a constructivist theory of learning. About the only thing I can find that attempts to demonstrate its effectiveness comes from the Sutton Trust’s report, What Makes Effective Teaching, where the authors found that there was not really any connection between student learning and the framework.

    Since this is such a widely used observation instrument–and as it’s premised on constructivism–is it really “research based” or worth pursuing? I am especially concerned, as there is little in the way of quantitative research available to support constructivist pedagogical practices. And, is there a research-based observational model that has demonstrated worth?

      • Hi Audrey–thanks for the response and reaching out to a colleague. Just to clarify, when I’ve read critiques of the Danielson Framework, they’ve usually been in the context of how it has been utilized and the ratings produced rather than whether the underlying pedagogical philosophy (constructivism) is sound.

        John Hattie last year openly stated: “We have a whole rhetoric about discovery learning, constructivism, and learning styles that has got zero evidence for them anywhere.” Also there is the cycle of papers from Kirschner, Sweller and Clark (and others) emphasizing that constructivist pedagogy (which is a bit slippery–the Danielson Framework uses it to push more student centered or student led classroom activities) is only useful for experts and not novices.

        Given that the Framework is intended to improve classroom instruction and teacher habits, if the things being labelled as “proficient” or “highly proficient” come from a pedagogical model that has little or no research support, is this doing more harm than whatever good may come from the benefits evaluations/observations can provide?

