This post will be most helpful for teachers who see the value in Themantic Education’s teaching principles and practices, especially those already using our teacher support packs.
Having an administrator or even a peer observe your lessons can be stressful, even for the most experienced teachers. The practical applications of our themantic model of curriculum design™ can give you the confidence to know that any of your lessons will satisfy even the strictest teacher-evaluation criteria.
Here are five questions that any good evaluator would want answered in any lesson:
- Where are your learning outcomes?
- Where’s your formative assessment?
- How are you differentiating?
- Are your lesson outcomes aligned with your summative assessments?
- “So what?”
Here’s how I would answer these questions if an evaluator were observing any of my themantic lessons:
Where are your learning outcomes?
My outcomes are provided for students every lesson. All of these can be found in the student workbook and are in alignment with those in my unit plan.
Students are given all the learning outcomes in the form of questions to answer because this makes the outcomes easier for students to comprehend. Giving problems to solve is also far more engaging than giving commands to follow.
So if you (the administrator), a student, a parent, or anyone wants to know what we’re learning at any one time, just check my unit plan or the kids’ workbooks.
Here’s an example of how I show my students exactly what the outcomes are by putting the topic and lesson questions in the workbook…
Where’s your formative assessment?
By using Themantic Education’s CHACER lesson plan, I gather formative assessment data at two points in every lesson (the C’s).
The first is the C – Consolidation. During this part of the lesson I’ll have activities designed to consolidate prior learning. Things like Kahoots, Jeopardies, quick quizzes, mind maps, and even simple Q&A sessions give me data that informs me of the success of prior teaching.
I then get more data during the second C: Check-In. Every lesson has a guiding question which is the core outcome. Most of the time students show understanding by answering the question in their workbooks and showing me directly, but it might also be the product of an activity we’re doing, like constructing a mind-map, solving a murder mystery or simply taking a quiz after a jigsaw. However it happens, I still get individualized data and can provide effective feedback.
How are you differentiating?
Where to begin?
The first aspect of differentiation is embedded within the themantic model of curriculum design, which includes the CHACER lesson framework. As already mentioned, every lesson the students have questions to answer, but these are framed using our three levels of learning:
- Knowledge and comprehension
- Understanding and application
- Creative and critical thinking
Surface level knowledge questions (green) provide scaffolding for students to obtain the knowledge necessary to progress to deeper understanding.
The understanding questions (guiding questions; orange) are the core outcomes that students are aiming for.
The extension questions (red) provide fast finishers a chance to extend and challenge their thinking.
This is reflected in the CHACER lesson plan:
- Giving students time to answer the question during the A-Activity phase of the lesson allows them to comprehend new learning and apply it to the question.
- Any student who demonstrates understanding in the Check-In stage before the lesson ends has a chance during the E-Extension phase to extend and challenge themselves.
Framing lessons around the three levels of learning ensures that all students are learning at an appropriate level.
There’s also differentiation in the delivery of content, as students have a range of resources they can use to comprehend and understand new information, including the textbook, the blog and the workbook.
Students who need additional support can also have pre-medial interventions: instead of catching up on work after the class has happened, they can get a head start by reading the upcoming lessons in the textbook and trying to comprehend new terminology in advance, either by themselves or with the help of a friend, parent, tutor, etc. By being able to answer the knowledge questions before the lesson even begins, students who take longer to comprehend information will have more time in the lesson to work on understanding. This is a more effective intervention strategy than remedial help after the fact.
Are your outcomes aligned with your assessments?
At the individual lesson level of the themantic model, as we’ve already seen, the understanding outcome (guiding question) is the same question that students show their ability to answer during the C: Check-In phase of the lesson. My formative assessment gathers data to check understanding of that core outcome, while consolidation activities mostly gather data on knowledge questions. In this way, the assessment and outcome are aligned.
At the topic level, the topic’s guiding question (e.g. “How can emotion affect memory?”) is based on the topics (i.e. outcomes) outlined in the IB Psychology guide, and these are assessed in IB-style summative assessments, like short-answer questions and essays. So while the outcome is phrased as a question, it’s assessed by an SAQ that uses a command term. For example, how emotion can affect memory is assessed by asking students to “Explain one effect of emotion on cognition.” In this way, the summative assessments are aligned with the lesson and topic outcomes.
All of this may sound pretty abstract, so it helps to use the unit plans and workbooks to demonstrate.
The individual lesson outcomes are also designed to build students’ knowledge and understanding so they can address the topic guiding question. Extensions are designed so they can think critically about the topic (this latter skill is shown in essays).
The rubrics that I use are also based on the three levels of learning, with definitions consistent with the lesson outcomes.
Here’s an extract from one of the rubrics I use that is based on two of the three levels of learning.
“So what?”
The theme part of the themantic model of curriculum design is where we plan units carefully by selecting ideas and themes that are going to be relevant for students, both now and in the future.
While an evaluator might not phrase the question this bluntly, they should want to know why the topic/s you’re teaching are relevant for students and why they should care.
For example, instead of teaching neuroplasticity by looking at London taxi and bus drivers (which has little immediate relevance), we teach it by looking at the effects of parenting and poverty on children. This is relevant because many of our students will one day be parents and if they want their children to be healthy and happy (as most parents do) they should know how their behaviour can affect the brain. They’ll also be tax-payers and voters one day soon, too, and so they should be able to make informed decisions when it comes to choosing to vote for someone who promises to pump money into the military or into social welfare.
We can also look at neuroplasticity by studying mindfulness and how this can change the brain in a way that could facilitate stress-reduction strategies. This is highly relevant, as students, especially IB students, are going to face stress and they should know how their thinking can affect this.
Similarly, we look at how prolonged stress can damage the brain by atrophying neurons. Hopefully this learning will stay with them in the future, too, so they can be mindful of sources of stress in their lives and how they might combat them.
One of the major themes that runs through the entire course is that we need to understand origins if we want to create effective prevention strategies – we can’t just treat the effects. In criminology, we can’t just shut people away from the world in isolation, as this won’t address the root causes of violence and might make it worse. Similarly, we can’t just give medication to people with disorders like PTSD, because this won’t address the etiologies.
Administrators mean well with their evaluations but those who aren’t classroom practitioners often lose sight of how difficult it is to be doing all of these things day-in, day-out, every lesson, all the time. At ThemEd we think our model and strategies work on an intuitive level and we hope that they can reduce teacher stress and can make it easier to cause learning to happen. This will help you pass any evaluation with flying colours, while you and your students are having fun along the way.
Have I missed a key criterion that your admin looks for when they’re observing you? Do you disagree with any of our principles or practices? Leave a note in the comments, as we need our ideas challenged!
Travis Dixon is an IB Psychology teacher, author, workshop leader, examiner and IA moderator.