Multiple-choice exams worsening the maths gender gap


Boys perform better than girls in tests made up of multiple-choice questions.

Multiple-choice questions are considered objective and easy to mark. But my research shows they give an advantage to males.

I compared around 500,000 test results from boys and girls who sat the same international test, but whose exam papers differed in composition (though not in difficulty). Specifically, the papers varied in the proportion of multiple-choice questions relative to open-ended questions.

I found the gender gap in maths scores widened with the share of multiple-choice questions in the exam, advantaging males.

This suggests the generally better performance of males in maths exams has more to do with the format of the test than with their maths knowledge.

How I conducted my research

Standardised exams are widely used to test students and screen job candidates. Australians take several standardised tests throughout their education — such as the NAPLAN, High School Certificate (HSC) and the OECD’s Programme for International Student Assessment (PISA).

Such exams, especially when maths is involved, regularly include multiple-choice questions.

For example, more than 70% of NAPLAN’s 2016 numeracy section was made up of multiple-choice questions. Every year, the maths HSC tests include a section with multiple-choice questions.

These prompt students to identify the correct response from a set of possible answers.

I analysed data from PISA 2012 and 2015. PISA is the largest international standardised test in maths, reading and science. Every three years, more than 500,000 students aged 15, from more than 60 countries, including Australia, take the test.


Each student taking the PISA receives a different set of questions, which are of similar content and difficulty. But there is random variation in the proportion of multiple-choice questions each student gets in their test booklet.

For instance, in 2015, some students received an exam mostly made up of multiple-choice questions (70%), while other students’ exam papers contained only 30% multiple-choice questions.

I exploited this random variation in the proportion of multiple-choice questions to investigate how gender differences in maths performance vary.
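The strategy above can be sketched as an interaction regression: because the multiple-choice share is randomly assigned, the coefficient on female × multiple-choice share captures how the gender gap changes with test format. The sketch below runs this on simulated data; all variable names, effect sizes, and sample values are illustrative assumptions, not figures from the actual PISA analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulated students (illustrative only, not real PISA values):
# each student is randomly assigned a multiple-choice share.
female = rng.integers(0, 2, n)            # 1 = female
mc_share = rng.uniform(0.3, 0.7, n)       # share of multiple-choice questions

# Assume a "true" model in which the gender gap widens with mc_share.
score = (500
         - 5 * female                     # baseline gender gap
         - 30 * female * mc_share        # gap widens with MC share
         + rng.normal(0, 50, n))          # test-score noise

# OLS with an interaction term:
# score ~ 1 + female + mc_share + female:mc_share
X = np.column_stack([np.ones(n), female, mc_share, female * mc_share])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# beta[3] estimates how the gender gap changes per unit of MC share;
# it should recover the assumed -30 up to sampling noise.
print(f"interaction coefficient: {beta[3]:.1f}")
```

Because assignment of the multiple-choice share is random, this interaction coefficient can be read as the causal effect of test format on the gender gap, rather than a correlation driven by who receives which booklet.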

What I found

Females performed worse than males on multiple-choice questions — this was especially the case when they received an exam booklet with 60% or more multiple-choice questions.

An increase in the share of multiple-choice questions by ten percentage points (such as from 50% to 60%) increased the gender gap in maths scores by 50% in favour of boys.
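To make that magnitude concrete, here is the arithmetic with a hypothetical baseline gap (the 10-point gap is made up for illustration; only the 50% figure comes from the finding above):

```python
baseline_gap = 10.0  # hypothetical gender gap in maths points at 50% MC share

# Per the finding: +10 percentage points of multiple-choice questions
# widens the existing gap by 50% (i.e. multiplies it by 1.5).
gap_at_higher_mc_share = baseline_gap * 1.5

print(gap_at_higher_mc_share)  # a 10-point gap becomes a 15-point gap
```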

Why is this happening?

I also analysed how students approached the answers by tracking the time it took them to respond to a question, as well as the number of questions each student skipped.

PISA data allows me to identify students who answer questions too fast (say in under three seconds, which does not allow for careful reading of the question).

Answering questions too fast or skipping them entirely can be seen as a sign of low effort or inattentiveness.
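A low-effort flag like the one described above could be computed roughly as follows. This is a minimal sketch: the three-second threshold comes from the text, but the per-question data layout and function names are assumptions for illustration.

```python
# Threshold below which careful reading of a question is implausible.
FAST_THRESHOLD = 3.0  # seconds

def low_effort_share(responses):
    """Share of a student's questions that were skipped or answered too fast.

    `responses` is a hypothetical per-student list of
    (time_seconds, answered) tuples, with answered=False for skips.
    """
    flagged = sum(
        1 for time_s, answered in responses
        if not answered or time_s < FAST_THRESHOLD
    )
    return flagged / len(responses)

# Example student: one 2-second answer and one skip out of four questions.
student = [(12.0, True), (2.0, True), (45.0, True), (0.0, False)]
print(low_effort_share(student))  # 2 of 4 responses flagged -> 0.5
```

Aggregating this share by gender and by each booklet's multiple-choice proportion is what lets engagement be compared across test formats.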

I found a gender difference in the approach students took to answering questions.

Overall, boys were less engaged in the test than girls. They answered questions faster and skipped more of them. However, this difference started to reverse the more multiple-choice questions there were in the test.

Girls were more likely to show a lack of effort when their exam contained more multiple-choice questions than when it contained more open-ended questions.

Previous research supports the idea that girls can be less engaged with multiple-choice questions. Girls tend to prefer questions that require more analysis and varied solutions, while boys are more likely to simply state their answers.
