When the IB asks the wrong question…

Travis Dixon | Assessment (IB), Curriculum, Themantics

The IB often gets it wrong when it comes to exam questions. (Image from Bigstock)

In the most recent IB Psychology exam the wrong question has been asked. And I’m not talking about the “etic” question, either. I’m referring to the SAQ from the cognitive level of analysis:

Describe one study investigating the reliability of one cognitive process.

This is obviously the wrong question to ask and may result in poor student performance, unless you’ve trained your students how to deal with just such a situation when the IB asks the wrong exam question.

This post might ruffle a few feathers, but it shouldn’t. It’s not aimed to embarrass or infuriate, but rather to illuminate and inform. I’m hoping that this post and others like it will:

  1. Help teachers teach and students learn
  2. Help the “powers that be” in the IB make the necessary changes to their policies and practices so we can have more transparency, validity and reliability in our IB Psychology assessments.

So what question should have been asked? This one:

Explain how one study investigated the reliability of one cognitive process. 

The difference is so slight that you might have missed it. The word "how" in this question is essential, because in reality it is what the examiners will be looking for.

Example Answer One: Taku

Let’s use a student, Taku, to demonstrate my point. Now Taku is not one of my current students (because he’s fictional), but he could definitely have been one of my students in the past, when I treated the IB command terms like gospel. In my early years, I trained my students diligently in the difference between outline and describe: outline is brief, whereas describe is detailed, so they had to give detailed summaries of the studies whenever a question asked them to describe (as this year’s question does). Now Taku’s a brilliant student who should be getting a 7, but he’ll actually end up scraping in with a 6. He studies hard and he can remember heaps of details of studies. So when this question comes up in the exam, he rubs his hands with delight, because Loftus and Palmer (1974) was pretty easy to comprehend and he knows it inside and out. This is the answer he writes.

It’s not terrible, but it’s not great. It would not, and should not, score top marks.

I feel really bad for Taku. He’s done exactly what the question asked, but he won’t receive top marks for it because the wrong question has been asked. He’s actually being expected to write an answer for a different question.

Example Answer Two: Min

Perhaps we should compare that to the sort of answer I’m expecting (hoping?) about half of my current students will be able to write in response to this year’s exam question, even though it’s the wrong question. I’ve learnt from my mistakes, and I now actively teach students to ignore the command terms in SAQs if they say outline or describe.* In fact, I tell students that there’s no difference between outline and describe in questions. They’re useful terms for formative work, and for thinking about what level of detail to include in essays or even within an SAQ, but in terms of guiding their short answers, explain is always the way to go.

Min has had the benefit of my move away from the misleading guidance of the IB’s command terms. She now knows that SAQs actually require her to demonstrate her understanding of a core concept in psychology, and that she must use supporting evidence. She also knows, after 18 months of experience, that to show understanding you must explain. She uses ThemEd’s definition of explanation: the communication of understanding of one or more significant relationships in response to a question or problem. So when Min sees this year’s memory reliability question, she knows to:

  1. Cross out describe and write explain
  2. Explain how the study is related to memory reliability

But Min also realizes that you can’t have a good explanation without signposts and description, so she knows to carefully select the relevant details of L&P’s experiment to make the point she wants about how the study investigated memory reliability. She does all of this in her planning.

This is the answer that Min came up with.

Which one do you think is better?

They’re both pretty good, but I think Min’s is a lot better. Why? She demonstrates evidence of a conceptual understanding: leading questions can act as false information, leading to the misinformation effect. She actually explains the study, rather than just describing it as Taku does.

Here are my annotated examples of Taku’s answer and Min’s answer. You can see that I would be telling Taku, “You’ve shown excellent knowledge, but limited understanding.” I’d be telling Min, on the other hand, “You’ve shown very good knowledge and excellent understanding.”


Are they that different?

Even if you were to argue that these answers are similar in quality, let’s think about it from a student, course and revision standpoint. Look at how much Taku had to remember about that study. Could you imagine him trying to remember this same amount of detail for all the studies he’s learned about in the course? Min, on the other hand, remembers the essential details. Her revision is less about building blocks and more about using those blocks to form important relationship chains.

But I believe that my students’ assessment results are always a reflection of my teaching. For Taku, I overburdened him with lengthy “key study summaries” that were 1,000 words long and distracted from the real reason he was learning the study in the first place. When he goes on to write essays, his answers will be just as descriptive: the bugbear of examiners. I can look back on my previous students and I know why so many who could have achieved 6s and 7s were getting 5s and 6s.

Min, on the other hand, could revise studies in less detail, which requires less memorization of building blocks. But she was able to make important conceptual connections between the studies and the concepts. This deep understanding is easier to recall once it’s grasped – it just takes time to grasp in the first place.

Furthermore, from a pedagogical standpoint, I’m far happier with Min’s answer. We don’t teach studies so kids know studies; we teach studies so kids grasp core concepts in psychology, can see where these concepts come from, and know that there’s evidence to support them. Ideally, those core concepts can transfer beyond the study of psychology, too.

I’m increasingly trying to refer to studies as the “evidence” in my classroom, so the kids put the core concept (the argument) first and use the “evidence” to support that concept (i.e. it comes second). Obvious for many, I know.

Implications For The New Guide

With the assessments in the new guide, I’m predicting a lot more questions that will ask students about a study in relation to a topic/concept, just like this year’s memory reliability question. Understanding just what these questions require is essential.

I will be publishing a book through ThemEd in early 2018 which will be a student’s guide to writing excellent exam answers using our language and frameworks.

If you look at the current sample papers, you can see some pretty compelling evidence that the IB may carry on asking the wrong questions, unless our voice is heard and we can convince them otherwise.

For example, in the new spec’ papers we see a question:

“Outline one study investigating schema processing.”

Then, when we read the marking guide, it clearly states: “The outline of the study could include aim, method, results of the chosen study to support the response on how the study investigated schema processing.”

If that’s what you’re asking and expecting, why not make it clear in the question?!?


What can we do?

I’m a strong advocate for the IB to drop Bloom’s taxonomy as their assessment framework and either adopt Biggs and Collis’ excellent SOLO taxonomy, or Themantic Education’s own amalgamation of these two taxonomies: our three levels of learning (Knowing, Understanding, Abstracting).

But convincing the IB of the necessity of this change will take time and much more evidence than just Min and Taku’s answers. Good things take time, and if we can get a critical mass on the OCC to voice our concerns and ideas, change should be inevitable.

In the meantime, if my arguments make sense, there’s nothing stopping you from adopting ThemEd’s language of learning in your own classroom: e.g. description shows knowledge; explanation shows understanding. Ignore the command term outline for SAQs, as it does nothing but hinder students from showing knowledge and understanding.

Consider using description as an umbrella term that encompasses outlining, stating, defining, summarizing, etc. Let’s keep our feedback simple for kids. How accurate and relevant their descriptions are shows their knowledge; how clear and relevant their explanations are shows understanding.

Those using IB Psychology A Student’s Guide will have plenty of practice at demonstrating understanding through explanation, as this is what every guiding question for each lesson intends to do (here you can see an example from lesson one in Criminology).

The other thing we can do is let the studies take a backseat in our courses: teach to the ideas instead and use the studies as supporting evidence. I have deliberately excluded “Key Study Boxes” from my textbook because I don’t want students’ attention diverted towards the studies and away from the significance of those studies. They’re in there in as much detail as needed, but they’re intertwined with the ideas.

Our IB Psychology A Revision Guide (due out before Xmas) will include summaries of core concepts and supporting evidence (i.e. studies and theories).

We can also use different rubrics to make our assessments clearer for students. Here’s a first draft of my essay rubric, which I will continue to amend and improve (feedback welcomed).

Activity Idea

Give students copies of the two un-annotated example answers by Min and Taku, see which one they think is better, and then ask them to explain why. Also, get them to highlight the explanation in both answers. What you’ll find is that students have an inherent grasp of the difference between explanation and description; it doesn’t take additional teaching.

Final Thought

I want to reiterate that this post is not about pointing the finger or trying to rally pitchforks and torches and get the villagers storming “the IB.” We are the IB, and like any organization, it’s made up of individual humans, so it’s subject to human error. But these errors can be fixed. Complaining without a solution is whining, and while I like a good niggle and a good piss ‘n’ moan session every now and then, I always aim to offer solutions and practical applications that will work in your classroom. I think at ThemEd we’ve figured out plenty, and this blog and our resources aim to share them with you. Of course, if you think I’m way off, speak up. I love going brain-to-brain over this stuff.

I hope this post will help you and your students find solutions to adopt into your IB Psychology class.

I also hope it will be read by those with the power to make the necessary changes, so we can start taking steps to improve the quality of our teaching, learning and assessment, not just in IB Psychology, but in all subjects.



*There’s one exception to crossing out describe and writing explain in SAQs: when the question asks students to describe a theory or model. In this case, students should give as detailed a summary as they can of the theory or model, and only if they have time should they explain how it’s related to a study. This is a common pitfall for many students with this type of question.