Let’s STOP the research methods madness!

Travis Dixon | Curriculum, Research Methodology

Our approach to research methods in IB Psych almost makes me as mad as command terms.

“What was the research method of (x’s) study?” Along with “Is this a good EE question…”, it’s the most common question IB Psych teachers ask. But we shouldn’t have to ask it, and we need to put an end to the research methods madness.

Why? Well, let me count the ways (or if you’d rather just read some exam tips, scroll to the bottom).

Firstly, not every study fits neatly into our definitions of “research methods.” At a rough guess, fewer than half of the studies I use in my course fit the definition of one specific research method. Numerous studies use multiple methods; many cross boundaries or don’t fit at all. And this is why you get the recurring “what method…” question from teachers.

Lamm et al.’s study comparing Cameroonian kids with German kids employed several different methods, highlighting our outdated approach to teaching “research methods” in IB Psychology.

Let’s take Lamm et al.’s cross-cultural study on kids and the marshmallow test I wrote about this week as an example (read more here). In fact, it was this study that inspired this post. They compared two different cultural groups, so it could be considered a quasi-experiment or a natural experiment (and don’t get me started on the lack of difference between those two methods); they used questionnaires and interviews to understand the parenting values of the different cultures; and they calculated correlation coefficients between these values and the kids’ ability to wait for the second treat. So what was the “research method” employed here? It doesn’t bloody well matter.

“So if it doesn’t matter, Travis, why are you getting your undies in a bunch?” I hear you ask. Because the reality is that when it comes to teaching research methodology, studies like this one get ignored because they don’t fit nicely into our boxes. This means kids miss out on a chance to study and learn about how psychologists actually use research methodology in real studies.

And where does a clear definition of what constitutes a “research method” exist? Are we to say a questionnaire is a research method until the data is used to calculate a correlation coefficient, at which point it loses that status and becomes a mere data-gathering technique? Stop the madness.

This absurd argument that some research methods are methods, while others are simply “data-gathering techniques,” lacks logic and reason. It’s instead based on long-standing and rather outdated notions that lack practical utility. What is an interview, a questionnaire or an observation if it’s not a way to gather data? Why are they superior to brain imaging? Clearly they’re not in the eyes of the IB Psych guide, since twin studies and brain imaging are both required methodologies to be taught.

Moreover, in the November 2011 exams, brain imaging was considered a research method: “Research methods used at the biological level of analysis could include experiments, case studies, observations, correlational studies and the use of technology to investigate the biological factors, e.g. neuro-imaging.” (IB Psychology Markscheme, Nov 2011). But now, according to the IB FAQs document, it has lost that privileged status. Why? We’ll never know.

And are lab, field, quasi and natural experiments one method or different methods? Again, in 2011 they were the same (see mark scheme above), but now in 2019 they have magically separated themselves: “…the different research methods for the study of psychology at this level: case studies, naturalistic observations, interviews, experiments, field experiments, quasi-experiments, natural experiments, correlations studies.” (IB FAQs Pg. 8). Perhaps it’s my confirmation bias, but this inconsistency in classifying and defining what is and isn’t a research method shows one plain and simple fact: we need to broaden our approach to teaching and assessing research methodology and leave behind our old stringent rules, for the sake of ourselves and our kids.

Perhaps this is why they used the phrase “approaches to research” in the new curriculum. That was one glimmer of hope, but alas, it is now being used synonymously with “research method,” so we’re back to square one.

What is an interview if it’s not a data-gathering technique? I’ve never understood its privileged position above other methodologies like twin studies and brain imaging.

Our antiquated and culture-bound notion of “research method” prioritizes some methods that are rarely used, like case studies, over others that offer equal if not greater value depending on the context, like meta-analyses, twin studies, or even regression analyses.

And don’t even get me started on the contradictions within the IB research methods framework itself. I mention case studies as an accepted “research method,” but we’ve just learnt that, despite their long-standing status as research methods (in this guide and in the last), they lost that status in the May 2019 exams. But don’t worry, they may regain it soon. Or have they already? I can’t keep up.

Why teach research methods at all? Good question. This is why I get fired up over this topic: I sincerely believe in the value of teaching kids about research methodology and understanding its applications and limitations. A lot of what we teach our kids now will be shown to be false in a matter of time. It’s inevitable. Just look at what’s happening with the replication crisis. This is why we need to understand methodology. Sure, I want my kids to remember what I taught them and to use it in their lives, but I also want them to become adults who, when pondering a question like “should I give my kid Ritalin?” or “is a glass of wine OK while I’m pregnant?”, or when questioning the use of some new drug to treat their disorder-we-don’t-know-of-yet, will stop to ask, “what does the research say?” And they’ll be able to consult the literature for themselves and make up their own minds. That is why it’s important to understand research methodology. Otherwise, they may stumble on the first article they find that spouts a correlation of 0.2 as causal and jump to an erroneous conclusion.
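As a quick, hedged illustration of that last point (the numbers here are invented for the example, not taken from any real study): squaring a correlation coefficient gives the proportion of variance the two variables share, so an r of 0.2 amounts to only 4% of shared variance, and even a large correlation would not, on its own, establish causation.

```python
# Illustrative sketch only: why a correlation of r = 0.2 is weak evidence.
# r squared (the coefficient of determination) is the proportion of variance
# in one variable accounted for by the other.

r = 0.2
shared_variance = r ** 2

print(f"r = {r}")
print(f"r^2 = {shared_variance:.2f} -> only {shared_variance:.0%} of the variance is shared")
# Prints: r^2 = 0.04 -> only 4% of the variance is shared
# And even a strong correlation, by itself, says nothing about causation.
```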

I’ll honestly say, I’ve never read a single word from an official IB document about pedagogy that I disagree with. The IBDP is founded on fundamental principles of good teaching that I whole-heartedly agree with. Where my critiques lie is with the practical application of these principles. We see this all too often in IB Psychology. What’s more, the frustration for me is that the problems are so easily fixable, yet they remain unfixed.

Grumbling about problems without offering solutions is just whingeing. I know the tone here is strong, but I’m not just having a piss and moan for no reason. So here’s what I’d like us to do about this madness: first, if the term “research method” is sacrosanct, let’s use “research methodology” instead (shout-out to Alan Law for this idea). Let’s also expand our list of possible methodologies to include things like twin studies, brain imaging, meta-analyses, longitudinal studies, regression analyses and anything else I’m missing. Let’s all agree (can we?) that the key concepts we’re trying to get kids to understand include the following:

  • there are multiple methods researchers can use in any one study,
  • all have their own strengths and limitations,
  • the choices researchers make will be dependent on the aims and context of the study,
  • and when you’re conducting your own research, you need to carefully consider which methodology to use.

We can teach kids this conceptual understanding far better if we stop the research methods madness and focus on a longer list, less strict definition, and broader understanding of research methodology. (I’ll hasten to add that I’m not advocating for a longer list for Paper 3, but rather a longer list of suitable methods for Papers 1 and 2. What we need for Paper 3 is simply a more definitive and clear-cut list).

Exam Tips

  1. True experiments and correlational studies are the most common methods, and they easily fit into our black-and-white boxes. This is why, when it comes to prepping for research methods questions for Papers 1 and 2, I encourage students to focus on only these two methods.
  2. Similar to the point above, true experiments and correlational studies can be used to highlight the strengths and limitations of one another, so if a question asks for “Evaluate one…” or “two or more,” students are covered. They will (should?) have a wealth of knowledge to draw on to answer the question.

Please consider my ideas here if you’re lucky enough to be involved in some way in the next curriculum review. In the meantime, I’ll continue my approach to research methods exam preparation of focusing only on correlational studies and true experiments. And until we make some changes like those I’ve suggested, I make no apologies for it.

As always, I’d love to hear your thoughts in the comments (and feel free to tell me to stop whining and get back to work creating more helpful posts!)