I came across a great report just released (November 2008, see below) that not only addresses the need for 21st century skills, but also how we need to develop assessments that will measure 21st century skills not found in the current assessment system. This is truly at the heart of TICAL's core mission, and we need to delve into this conversation. I would highly encourage you to read the report, and I would be interested to hear your thoughts in this forum and at our next cadre meeting in January.
(This report is a product of Education Sector’s Next Generation of Accountability initiative. The initiative seeks to strengthen public education by examining key elements of accountability, for instance, who should be responsible for student success and how they should be held responsible. Our work seeks to build on the strengths of current school accountability systems, more fully and effectively measure the depth and breadth of students’ educational experiences, and encourage educators, parents, policymakers, and the larger public to pursue educational equity and excellence for all students.)
Thanks for passing this on. The report was very interesting. Of particular interest were the 2003 results of the PISA exam. Our students usually do very poorly on this exam. I think we are at first tempted to blame NCLB for our lack of focus on problem solving, but these results show that NCLB may not be the culprit. NCLB was signed into law in January of 2002, and this test measured 15-year-olds in 2003, only a year into NCLB. It leads me to believe our schools were failing in these areas long before 2001. Many of us think that if only NCLB went away, our schools would do better with 21st century skills. I think this shows us that we have a lot of work in front of us to ensure that our schools are teaching these skills, and we will need to have measurements in place to assess how we are doing. I say, let's go see what they are doing in Finland. Even Sir Ken is impressed by Finland's schools. Field trip, anyone?
This is an exciting prospect: having assessment become more than just filling in the bubbles, and figuring out how to do this on a broader scale. Having taken the national test to become an administrator, I agree that having to respond in depth to the scenarios in the test, and having to pull together and justify my answers to those situations, seems far more relevant than answering a multiple-choice test, or even writing an action research or thesis paper. Being judged against a national standard for excellence (with each state having the option of deciding what its passing level of competence is) was a satisfying experience. And yet the process of writing all this out and having it mailed in seems so time-consuming and bulky.
This is where the technology I saw in use in my current AB 430 class becomes so relevant. One of the participants was using a notebook on which she could write answers that, she told me, could be converted to text for emails, files, etc., or she could, of course, just type things up. With the advent of this type of technology, the possibility of having all students complete this type of testing and be judged by the same standards, whether at the site, district, state, or national level, becomes far more 'doable'. Multiple people could look at the same text at the same time. Results could be quickly transmitted back and forth. Actual feedback could be given to the test-taker by those who judged the assessment. There would even be the possibility of a response to that feedback from the test-taker. The possibilities are very exciting to me.