
The Art of Planning Deeper Learning

A Deep Learning Experience
Deep Learning experiences develop when the learner is able to identify and link pre-existing knowledge and understandings. As the learning becomes deeper, the learner is then able to extend and apply their ideas. As teachers, this is what our classrooms should be all about – using activities and questions to construct opportunities to achieve more complex and deeper understandings.

Recently I had the pleasure of listening to John Hattie speak at the launch of the ‘New Pedagogies for Deep Learning’ project. John is highly regarded and widely referenced for his research (meta-analysis) into what makes an impact in the classroom. John talked about the ‘effect sizes’ of a number of teaching strategies and spent some time discussing the concept of ‘Surface to Deep Learning’: the need to construct opportunities to build Surface Understanding and then move students to Deeper Conceptual Understanding. The fact that a lot (up to 90%) of what is happening in schools is based around Surface Understanding indicates the current challenge of incorporating Deep Learning. To have a significant impact in the classroom we need to build both Surface Understanding and Deep Understanding, but importantly teachers need the skill of knowing when to move between the Surface and the Deep. He talked about SOLO Taxonomy as a tool for planning and measuring Surface and Deep Understanding. The overview of SOLO, although simple, seems extremely useful for designing challenges and assessing understanding during Deeper Learning experiences.
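For readers less familiar with the term, an ‘effect size’ in Hattie’s work is simply a standardised difference in means. As a rough sketch (my summary, not Hattie’s exact procedure for every study):

\[ d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}{SD} \]

Hattie uses d = 0.40 as his ‘hinge point’ – roughly the growth he associates with a typical year of schooling – so it is the strategies sitting well above this mark that he highlights as worth our attention.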

On reflection, I couldn’t help but draw comparisons with my own Deep Learning that was occurring. John presented facts, thoughts and questions that helped me link some of my previous understandings about learning and my own experiences from my classroom. I was then able to look at learning through a different lens and to extend and answer some questions about my approach to learning.

SOLO Taxonomy
SOLO stands for ‘Structure of the Observed Learning Outcome’. SOLO Taxonomy was developed by Biggs and Collis (1982) and is often described as a ‘framework for understanding’. It describes the processes of understanding used by students when answering prompts, and in this way it can easily be used to assess the level of understanding demonstrated by students.

Some levels of SOLO can be described as Surface Learning (typically Unistructural and Multistructural) and others as Deeper Learning (Relational and Extended Abstract). The Surface Understanding is required for the Deeper Understanding to develop (it is not independent).

[Image: SOLO Taxonomy table]

In addition to assessing understanding, it can be useful for designing strategies and questions for the classroom. One of the things that makes it work well here is the way it includes content knowledge (Surface Knowledge) as the building block for the Deeper Learning. Therefore, as a planning tool it could be very useful for planning differentiated learning or responsive tasks.

The real art of a teacher that John Hattie described was knowing when to move a student from Surface to Deep Learning, and knowing the occasions when to move back to Surface Learning again.

As one of my goals is to increase Deeper Learning with my students, this is an ‘art’ and a ‘timing’ that I would like to develop. I also believe this is a broader pedagogical push in education. We are rapidly moving away from students who need to ‘know stuff’, as knowledge is often only a Google search away. The skills required of students to undertake Deeper Learning will be more valuable in the information-rich world that they will grow up in. The timing is right to move away from the classroom that is based around Surface Knowledge!

Some things to check out about SOLO Taxonomy:
A good summary of SOLO Taxonomy – About SOLO Taxonomy
Great video – SOLO Taxonomy explained using LEGO
John Biggs – SOLO Taxonomy

Floating with Feedback on the Surface
… and then Diving Deeper
A critical challenge for our teaching team over the last few years has been investigating how we can move towards personalising learning in our classrooms. By this we mean that for every student we know ‘where they are at’ and then create appropriate experiences to ensure that ‘they move forward’. Essentially we were wanting to identify the ZPD (Zone of Proximal Development) for each student and then utilise strategies, including differentiated learning, to enable growth. There are many pieces to this puzzle, but we identified a key focus to be formative assessment that delivers feedback from the learner. In our context this seemed to be the missing link for personalised learning – it was quality feedback from the learner that we needed to improve.

This direction was backed up by John Hattie’s research, where he identified ‘feedback’ as one of the strategies with the highest effect sizes – one of the things that ‘expert’ teachers do to make a difference in their classrooms. Importantly, he also identified that “feedback was most powerful when it is from the student to the teacher… and then teaching and learning can be synchronized and powerful” (Hattie 2009).

We went on a mission and implemented many forms of formative assessment with the aim of delivering precision in learning. We looked for assessment strategies that were frequent, useful, had accessible results and, in most cases, were enabled by digital technology. Our goal was to be making ‘Daily Data Driven Decisions’ – the decisions that we make on a daily basis about students’ learning needed to have the support of useful learning data.

This has resulted in a great shift in our practice and, more importantly, in the learning occurring in class. However, as we reflected on our progress we identified a few areas we now wanted to address:

  1. Opportunities for Deeper Learning – We recognised that a lot of what we were delivering and assessing in class was Surface Learning and we wanted to include opportunities for Higher Order Thinking (which I now prefer to call Deeper Learning). It was interesting to hear John Hattie describe that most (up to 90%) of learning in classrooms could be categorised as Surface Learning.
  2. Stimulate ‘Inquiry’ in the classroom – We wanted to cognitively engage students in the scientific problems of the world around them. To achieve this we needed to focus on the timing of questioning/challenges.
  3. Collaboration – Collaboration is one of the key principles of our College’s Pedagogical Vision. We are always keen to look for opportunities to increase Collaboration in the classroom.

Therefore, in this next stage our goal was to create challenging activities for our class (investigations) and to regularly use challenging questions (provocations) with our classes, both designed to be collaborative and to stimulate Deeper Learning. In a previous post, ‘Harnessing provocations for Deep Learning‘, I have explored the nature of creating such provocations.

A model I found useful when describing the nature of investigations/provocations is The Learning Challenge (James Nottingham). In this model students enter a phase called ‘The Pit’. In such a phase there might not be an obvious solution to the problem, and there is often ‘cognitive conflict’ in the students’ minds about a possible solution. In a well-designed challenge students have the knowledge and understandings (often some of their Surface Learning) to construct a new Deeper Understanding and a solution to the problem. The important concept for us is that the students didn’t have the answers first! We were trying to create an environment where it was a challenge for them to construct understanding. I often explain such problems to students as ‘chin scratchers’ or ‘thinking questions that need a thinking answer’, and that ‘they are tough problems to solve’, that to do this ‘learning is hard work’… ‘A Learning Challenge’.

An important part of this next stage is that we didn’t want to replace our Formative Assessment strategies; instead, we wanted to keep these and add challenges to create opportunities for Deeper Learning. We weren’t ‘Moving To’; it was more the case that we were ‘Adding’.

Enhanced with Collaboration
Collaboration is not only something we value in our classrooms; it also added a lot to the nature of the challenges we were creating.

 

With the inclusion of collaboration:

  • Students are more likely to spend time in ‘The Pit’ – The Pit can be pretty lonely by yourself and it can be a rush to get out. It can be a lot more comfortable knowing you are there with the support of others, and with this confidence students might be more likely to persist with the challenge. Simply, more time in the challenge can mean more time to find links and to extend and develop ideas.
  • Students are more able to get out of ‘The Pit’ – Therefore we can design ‘The Pit’ to be a little deeper and increase the challenge in our classrooms. In Vygotsky’s explanation of the Zone of Proximal Development he describes the movement of a student from their level of ‘actual development’ to their level of ‘potential development’. He defines the level of potential development as the ‘level of development the learner is capable of with collaboration’ (with teachers and peers). It is often this collaboration with peers that is overlooked when considering the potential of learners; instead, potential is treated as an individual limit for students.
  • Students could be more reflective – As students are discussing the problem with others it offers many opportunities for metacognitive reflection. Reflections such as – How has their thinking changed? What do they know now that they didn’t before? What strategy worked? Why did this strategy work? Could this strategy be applied to another problem? How would you change your approach next time?

The skills we aspire to develop in learners to prepare them for the future are developed in experiences that are challenging, collaborative and require deeper thinking.

Timing is Everything!
One of the skills of being an ‘expert teacher’, as described by John Hattie, is knowing when to move students from Surface to Deep Learning (and sometimes back to Surface).

To plan our timing it has been useful for us to define what we mean by ‘Inquiry’ in our class and to categorise some phases of the learning cycle. It is important to clarify that we are not implementing an Open Inquiry Model where students are creating and researching their own questions. Instead we are guiding inquiry, using carefully constructed questions (provocations) to engage learners in the chosen concept and to challenge them to build on and extend their ideas. To help us position these experiences in the Learning Cycle we have used the adapted phases of an Inquiry Model (Karplus) – Explore, Explain, Apply. Many will be familiar with a development of it in the 5E Instructional Model (Bybee) – Engage, Explore, Explain, Elaborate and Evaluate. This isn’t a strict formula of delivery, but more a tool to help us guide this movement in and out of Deeper Learning experiences.

EXPLORE
In this phase we are wanting to stimulate thinking and cognitively engage the students in the key concept. It is not just an interesting introduction but a hook that makes them question their own understanding. Throughout the remaining Learning Cycle this experience offers a great reference point to which students can link upcoming learning. A good way to look at this is that the hook can be somewhere students can hang new understandings. Aside from offering this cognitive engagement and reference point, it can also be beneficial to expose and gauge learning at this early stage.

The challenges during this phase might often have a ‘Shallow Pit’ (a smaller challenge) in comparison to later problems. As the students may not yet have collected the required content knowledge and understanding (Surface Learning), they might not be able to tackle a deep problem. A well-constructed challenge might offer a solution, but leave room for the concept to develop and for other ideas to be added as learning develops.

EXPLAIN
In this phase we aim to develop experiences that allow students to understand the concept. We might deliver Focus Lessons, which involve explicit instruction to deliver much of the content knowledge. Following this we might include a number of activities and tasks that reinforce this understanding. It is also the phase where we use many of our formative assessment strategies to gauge student progress and, where possible, differentiate learning.

It is in this phase where we are strategically developing the building blocks (Surface Learning) for Deeper Learning.

APPLY
In this phase we look for opportunities to deliver and assess Deeper Learning as students undertake more challenging questions. Students apply their understanding and link ideas – Justify, Explain, Compare… (Relational). Students might also extend their ideas and apply their understanding to new situations – Hypothesise, Create, Predict, What Next… (Extended Abstract). In the Science classroom this would commonly include experiments and investigations to enable these challenges, but it can also include questions as provocations for this Deeper Learning.

In particular this is the phase for the bigger challenges – the Provocations and Activities that really put the students in ‘The Pit’.
Based on student achievement, it may be appropriate to move back into the Explain phase.

Visibility of Deeper Learning
Our initial work with feedback provided a visibility of learning in our classrooms that allowed us to more accurately meet student needs. However, the limitation of much of this formative assessment was that it was based around multiple-choice and short-answer questions, and as a result it tended to focus on Surface Understanding. As we constructed Deeper Learning experiences for our classrooms it was then important to consider how we might assess understanding during such activities. Our challenge was to have visibility during Deeper Learning so we could increase precision here as well. It is during these experiences that misconceptions and thinking processes can be more visible, and visibility here can be crucial in being responsive and knowing when and who should be engaged in Deeper Learning experiences.

As with our other forms of formative assessment, we turned to technology to work effectively. We used a web-based app called Verso, which allows us to structure student collaboration around a stimulus question (provocation). After accessing the media, students post an anonymous response before they read, comment on and respond to the answers of others. Supported by collaboration with their peers, students are able to build understanding of challenging questions. As a teacher, being able to see every student’s posts and interactions gives me great insight into their learning. It is during such challenging questions that misconceptions and thinking processes are even more exposed. An additional feature of Verso is that you can group responses; this enables responsive instruction, where you are able to use data to drive your decisions.

To see how Verso works see my video – A Tour of Verso

Once visible, the next part of the problem was how to measure and assess the levels of this understanding. This is where SOLO Taxonomy ties everything together: a simple and effective set of criteria suited to assessing the level of understanding in student responses. Certainly it would seem a much more usable tool than Bloom’s, which appears to limit itself by linking the standard of the question being asked to the standard of the response. It would also seem that SOLO becomes quite usable because it integrates the content knowledge (Surface Understanding) rather than separating it.

Visibility enables the art of timing Deeper Learning experiences and allows us to respond effectively.

SOLO Taxonomy, purposeful provocations, strategic timing, digital technology – these have all been tools used to construct and assess Deeper Learning experiences. What else do you think can be used in the art of planning Deeper Learning?

Letting Go and Letting the Questions Flow


I am sure there are no arguments that the effective use of questioning and genuine classroom discussion is one of the keys to maximising learning. In recent years many of my students’ interactions (including questions and discussions) have moved online… and, wholly virtual cow, this is a game changer!

For the last couple of years I have been flipping a number of my classes with the use of online video. This has allowed time in the classroom to become increasingly student centred and to be all about the learning activities that work best when we are together as a group. This has resulted in a shift online for both the independent work (mostly at home) and for some of the collaborative work that can flow in and out of the classroom. Sitting alongside these tasks are our questions and discussions, which we have trialled in a variety of formats – learning management systems, quiz tools, blogs, Google Forms and message boards.

I see these online questions and discussions mostly slotting into a few main categories (see a detailed reflection of my current state of affairs HERE):

1. Checking student progress – What has been finished? What were the key words in your summary?  What were the concepts that you have summarised?

2. Student ‘help’ questions – What don’t you understand?  What do you need help with? What do you want me to cover in class?

3. Formative assessment – Questions to test understanding of the concept.

4. Student Inquiry – Genuine student questions related to the concept. What does this make you wonder?  Why does this occur? What areas interest you?

The last category (Student Inquiry) is really where I want to shake things up a little.  After all, when we learn Science, what we are really talking about is the process of students understanding the amazing world around them.  With this in mind, it means that the ‘what does this make you wonder’ questions are what we need to hit to truly cognitively engage students in Science.

I believe there are some great benefits to the other types of questions and strategies I am using in my class. However, I am wary that these alone could lead to a more ‘factory-like’ content delivery schedule and that I could miss some of this essential engagement and inquiry. I also wonder if my previous attempts to curate these types of student inquiry questions on our blog/message boards have actually got in the way of the natural flow of this type of questioning – too much control!

Some questions for me:

  • As a science teacher how can I foster natural inquiry and discussion about Science?
  • Could this be another area in the student centred classroom where I need to step back and release responsibility? Am I able to let the questions flow naturally?
  • Where in the learning cycle would be the best place to encourage this?
  • How can I facilitate this so that it will be intuitive for students to ask and answer?

As I step back and decide where to next, I have a few guiding thoughts:

  1. Don’t start with the tool – I want to be really clear about my needs before I choose and settle into a tool. Don’t get me wrong, I still intend to get really excited by the fantastic tools out there. Currently I am looking forward to getting onto Verso to enable the setup of the inquiry. There are also some great options for putting quizzes alongside your videos – Google Forms (now with embedded video), Sophia, Camtasia and TED Ed.
  2. Go for something intuitive – It needs to be intuitive for students. It can’t be too many clicks away or too hard to post. Having notifications could also be beneficial to hook students back into the discussion. For many, the discussion formats of Facebook, Verso, Edmodo, Schoology and Ask 3 (if an iPad school) are the sort of online discussions that seem intuitive for students.
  3. Plan for Inquiry – Plan opportunities and activities that encourage students to inquire.  Plan activities that scaffold the skills to ask and answer questions.  Consider the timing/position of these discussions in the learning cycle.

I would love to hear from other educators about their approaches in using ICT to create questions and discussions that promote inquiry.