Wednesday, 20 September 2017
Progressions, portfolios and learner agency
Blog writing
This is my second blog piece. It is harder than I thought it would be. It isn't that I don't have lots to say; rather, finding a coherent way to express all the thoughts jumbling around in my head is quite a challenge. It is like writing uni assignments, but fortunately with no-one grading it at the end. However, feedback is both welcome and needed if I am to continue to learn. So please let me know what you think.
Progressions
This year I have been working on a progressions project in the Science Learning Area. It has come about because more and more schools seem to be requiring teachers to report on student progress against the curriculum in each learning area, including science.
The intention of progressions is that students know where they are at, how they are doing, and what they need to do next - thinking based on John Hattie's Visible Learning and Michael Absolum's Clarity in the Classroom. I have no problem with this intention; in fact, I think it is a key part of developing learner agency. However, learner agency seems to have been lost in how progressions have been interpreted and implemented by schools.
Faulty assumptions
The interpretation of progressions in some schools is underpinned by questionable assumptions.
1. The levels of the NZC show a logical progression in learning for all learning areas.
This assumption has led to some schools requiring their staff to report on where students are at in terms of curriculum levels in all subjects. However, if you read the NZC achievement objectives, they don't all build upon one another in a clear, distinct way. In our science project, careful examination of the achievement objectives under the living, material, physical, and earth and space science contexts shows that this is not the case. For example, under astronomical systems, students investigate the interactions between solar, lunar and earth cycles at level 6, but at level 7 they are exploring the life cycles of stars.
2. Progress against the curriculum can be broken into year-level stages.
Many achievement objectives are the same for more than one level of the curriculum. For example, levels 1 and 2 of the science curriculum have the same statements - that is 4 years of learning. How does a teacher decide whether a student is below, at or above within such a broad band? The draft digital technologies curriculum provides another example. Here, at least, the MOE has funded the development of progressions, but for 'Designing and developing digital outcomes' there are only 3 levels of progression across years 1 to 10.
Portfolios
So how are teachers supposed to identify where a student is at, given that progress may occur over several years and may need to be demonstrated in a number of areas? My answer is that it is now essential that students have portfolios of learning that they carry with them (in the cloud), in which they collect evidence to indicate progress across their various curriculum areas. Sue Suckling (NZQA chair) takes this even further with her statement at last year's SingularityU summit:
"The day of the qualification is over. The era of verification is coming"
To read more, see the NZQA micro-credentialing pilot.
Learner Agency
So building up a portfolio (for want of a better word) seems to be the way to go. What do teachers need to do for this to happen? They need to put the learning in the hands of the students. If students are going to take responsibility for collecting evidence to verify their progress and achievement, then they need clarity on what progress looks like, and they need support to develop their competencies, so that 'learner agency' isn't just a buzzword but a reality for every learner. Unfortunately, however, learner agency is not what most students experience every day in their schools.
Where to next?
Which brings me back to the science progressions, and indeed my role as a facilitator. We have developed progressions without any investment from the MOE; rather, the support has been provided by our employer, The University of Auckland. But how, in the current PLD environment, can we take it further? How can we take what is currently a nice-looking document, one that has had a heap of challenging thinking put into it, and support teachers and students to progress in science?
And on a personal note, do I, in my facilitation role, have enough agency to generate any shifts in practice in this direction, given that schools are dictating what effective PLD looks like?
Tuesday, 20 June 2017
Moving forward without a map
I handed in my 2nd assignment yesterday (phew - such a relief to get that finished) and it involved applying research to practice in my role as a facilitator. This has got me thinking about frameworks that might guide our work in this new centrally funded PLD environment. At present I feel like I am moving forward without a map to show where I am heading.
I am currently working in 6 secondary schools, with 3 schools and 1 COL on the horizon. For each of these schools, the focus of the work is different:
- Leadership of professional learning, with a focus on embedding teaching as inquiry and a collaborative culture
- Literacy within science
- SOLO taxonomy
- Middle leadership support in science
- Middle leadership development across the school
- High expectations pedagogy
In each of the schools, there is at least one other facilitator working alongside me. I see this as a very positive thing, as it is very easy to get tunnel vision when working by yourself.
I am focussing on being responsive, and I always try to find out as much as I can about the context and issues before working with leaders to determine in more detail what the work might look like. I use school websites, NCEA data, ERO reports, journal applications and annual plans; talk to colleagues who have worked there previously; ask questions of, and listen to, leaders, teachers and support staff; and often observe lessons and gather student voice.
I am wondering, however, whether we should be thinking about a more formalised framework for this work - some form of inquiry process. It would need to be responsive and flexible, with evaluation built in, to give us direction and a means of reflecting and reporting on our work. In other words, something that helps us understand where we are at and where we are heading. I know some of the 'packaged programmes' that used to be delivered had something like this.
What are you using to guide your work? How are you evaluating your effectiveness?