The MINNESOTA VIVA TEACHERS REPORT was released this week. It's generally pretty good except for one glaring misstep: how it handles the issues related to measuring student achievement. The report seems to assume that all measurement of student achievement is created equal. It talks about VAMs (Value-Added Measures) as if everybody knows and understands VAMs, when in reality VAMs are relatively new to education and not widely understood. The report rightly excludes VAMs as inappropriate for decisions about teacher employment, but allows them as a helpful tool for determining whether a curriculum or teaching strategy has improved student achievement.
The problem with this is that value-added measures, as they currently exist, are useful for measuring only a very limited slice of curriculum and teaching strategy. A value-added measure is a number that reflects how a student performs on a test compared with a previous administration of the same or a similar test. For the most part, we only have tests in reading, math, and science. There's a lot more to K-12 education than reading, math, and science. And there's a lot more than one person responsible for an individual student's learning, even in reading, math, and science.
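Stripped to its core, a value-added number is just growth relative to an expectation. A minimal sketch with hypothetical scores, to make the idea concrete; real VAM models are far more complex (regression with student covariates, cohort-estimated expectations, statistical shrinkage), so a fixed expected-growth constant here is purely illustrative:

```python
# Hypothetical records: (prior_year_score, current_year_score) per student.
students = [(210, 228), (195, 205), (240, 244)]

# Assumed constant for illustration; real models estimate expected
# growth from large cohorts, not a single fixed number.
EXPECTED_GROWTH = 12

def value_added(prior, current, expected_growth=EXPECTED_GROWTH):
    """Growth above or below expectation for one student."""
    return (current - prior) - expected_growth

# The per-student numbers get averaged across a class; that average is
# the kind of figure a VAM attaches to a teacher, curriculum, or program.
per_student = [value_added(p, c) for p, c in students]
class_value_added = sum(per_student) / len(per_student)
```

Note what never enters the calculation: anything about what happened in the classroom between the two tests, which is exactly the gap discussed below.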
The big missing piece is that VAMs don't look at what the teacher, or teachers, or school, or anybody else does between the times the student takes the test. That's the teaching part. The report does a fair job of explaining the many variables and limitations of measuring student performance, which is an obvious reason these measures shouldn't be used to make decisions about teacher performance.
I know that I'm making things more difficult for those who want an easy way to measure teaching and learning. We tend to think it should be easy because all of us have made subjective judgments of teachers and teaching since we entered our first classroom. And we've frequently found plenty of other subjective reports to support those judgments. But an individual subjective assessment, or even a collection of individual subjective assessments, is not the same as a professional, objective assessment of the art of teaching that is consistent across an entire school district or beyond.
A good place to start making this very complex situation more manageable would be to focus on formative assessment instead of summative assessment. If we do enough formative assessment and are careful about recording and communicating it, summative assessment becomes unnecessary. We won't need standardized tests. Teachers have always done formative assessment, but only recently have we had the ability to record and communicate those observations, quiz results, and homework grades effectively. Getting a good score on the test at the end of the year, or even the end of a unit, is not the same as learning. With the tools now available, we don't need to have students take standardized tests; we have the ability to record and communicate student learning as it occurs.
I agree with the previous comment that this is a valuable take on a thorny issue.
At the end of the piece, Dan suggests that good formative assessment, carefully collected and categorized, makes summative assessment somewhat superfluous. I agree. But I don't think we're quite at the point where it is easy to create and collect good formative assessment. Maybe asking for "easy" is off-target. Thoughts?
Thanks, Seth. Continuing on the point of agreement: good formative assessment, accurately reported and stored, could or would make summative assessment irrelevant. You're absolutely correct that the current methods and processes of formative assessment aren't easily or widely reported. The question then becomes whether it's better to continue as we're doing with summative assessments, or to take a new course that will yield much better results. Once there's a stronger consensus about the benefits of sufficient formative assessment as an alternative to summative assessment, the path ahead will be easier. The current process really isn't easy, either, is it? It's just familiar.
I'm not sure I agree that formative assessment can replace summative assessment. Formative assessment should be used as a no-risk opportunity for students to demonstrate what they understand. It should help the teacher guide instruction and potentially even exempt students from content they have already mastered. I'd like to think that a summative assessment is more of a product a student creates using higher-level thinking skills that bring concepts together. Formative assessment is usually reserved for lower-level thinking skills: multiple choice, true/false, matching, short answer, and so on.
Jon, you're right that formative assessment is not a one-for-one replacement for summative assessment. Summative assessment as you describe it, a student product, is different from the standardized, once-a-year tests currently being used as summative assessments. I'm envisioning formative assessment that is stored and reported as part of a portfolio, which would hopefully also include student-created products serving as summative expressions of their learning. Formative assessments and comprehensive portfolios can, I think, make standardized tests irrelevant.