Hello everyone. This is an updated version of last year's post on GCSE history results. I have left the previous post intact, but added the new results for 2019 in blue text for those interested, and commented where things have changed from 2018. The first version of this post was published on 24 August 2018.

-------

Evening all, I thought I'd take a few minutes to outline five key things we found out about the new History GCSEs. OK, it's really four and a question, but hey ho!

1) Pupils this year did pretty much the same as last year overall

No changes here, and no surprise given the details below. I have updated the chart to show the 2019 results side by side.

This is not really a surprise, as Ofqual demanded a statistical tie between 2017 and 2018. Therefore almost the same proportion of kids got a G/1 or greater as last year, and the same for C/4 and A/7. There were some minor differences, but at a school cohort level of, say, 100 pupils, the difference would have been less than a single pupil missing out on a grade C/4.

Of course, this does not mean everyone's results will have been stable. It is common with new specifications for some schools to do much better than normal and some to do much worse. This is usually because some schools manage to match what examiners were looking for more closely. It is almost impossible to second-guess this precisely in the first year of a specification, as examiners refine their expectations during first marking and grading discussions.

Takeaway lesson: NO CHANGES HERE. Read the examiners' reports closely, and get some papers back if you were not happy with your overall results.

2) Your choice of board made almost no difference to overall grades (on average)

There is very little change here this year. The distribution of awards per board seems to be fairly static, and this reflects the fact that awards are still tied to pupil prior attainment.
From this we can infer that centres doing OCR A tend to have cohorts with higher prior attainment, and that a greater proportion of higher grades could therefore be awarded. Setting that point aside for a moment: because the boards all had to adhere to this basic rule when awarding grades, the differences between boards are effectively non-existent. If you look at the detail you will see that some boards did deviate from the 2017 figures; however, this is because they have to take prior attainment into account. So the fact that OCR A seem to have awarded more 4+ and 7+ grades would suggest that more high-attaining pupils took these exams. By contrast, OCR B probably awarded slightly fewer 4+ and 7+ grades due to a weaker cohort. This might imply that OCR centres chose their specification based on the ability range of their pupils (though this is pure speculation). AQA and Edexcel pretty much fit the Ofqual model, suggesting they had a broadly representative sample of pupils.

Takeaway lesson: None really!

3) The papers this year continue to be harder than pre-2018 [updated statement]

Comparing the raw marks required for different grades, there continues to be a big difference between pre-2018 exams and the current crop. Differences between the 2018 and 2019 grade boundaries, however, were fairly minimal. That said, there were some interesting differences, which I have outlined in the charts below. Given that the proportions of students getting grades 1+, 4+ and 7+ were fixed, the moving grade boundaries tell us a little about how difficult/accessible the papers were:
In the chart below, you can see that in 2017 pupils in history generally needed 74% of the marks to be awarded a grade A; in 2018, they needed only 60%. In 2019 this increased a fraction to 61%. As a rough measure, this suggests the A grade in history was 19% harder/less accessible for pupils in 2018 vs 2017, but "only" 18% harder (!!) in 2019.

The big issues are much more evident at Grade C/4 and G/1, where pupils needed just 37% and 6% of the marks on average in 2018 vs. 55% and 18% in 2017. In 2019, the Grade 4 figure sat at 39% of marks and the Grade 1 figure at 7%. In essence this makes the 2019 exams 30% harder for pupils to get a Grade 4 than in 2017 (vs 34% harder in 2018) and 60% harder to get a Grade 1 (vs 69% in 2018).

Now, we could argue that this does not matter, as the same proportion of students still got the various key grades, as explained above. However, there are some fundamental issues here.
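For anyone who wants to reproduce the "X% harder" figures above, the arithmetic is just the relative drop in the share of marks needed for a grade. Here is a minimal sketch in Python (not part of the original post; the function name is my own, and I use only the rounded grade A boundaries quoted above, so figures computed from unrounded boundary data may differ by a point or so):

```python
def relative_difficulty_change(old_pct: float, new_pct: float) -> float:
    """Relative drop in the share of marks needed for a grade, as a percentage.

    A positive result means the paper got 'harder/less accessible': pupils
    needed a smaller share of the total marks to reach the same grade.
    """
    return (old_pct - new_pct) / old_pct * 100


# Grade A boundary in history: roughly 74% of marks in 2017,
# 60% in 2018 and 61% in 2019 (figures quoted in the post).
change_2018 = relative_difficulty_change(74, 60)  # 2017 -> 2018
change_2019 = relative_difficulty_change(74, 61)  # 2017 -> 2019

print(f"2018 vs 2017: ~{round(change_2018)}% harder")
print(f"2019 vs 2017: ~{round(change_2019)}% harder")
```

Rounded, this gives the 19% (2018) and 18% (2019) figures used in the text.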
Takeaway lesson: NO CHANGES. Your top-end pupils are probably fine with all the changes, but your weaker students are going to struggle. Strategies to finish papers, or answer more questions, will be key, alongside boosting core knowledge and building confidence in the face of hard questions.

4) Some boards were harder/less accessible than others

I have added some further 2019 analysis to this section.

Following on from the above, a closer analysis of the grade boundaries reveals some interesting things. Again, we take the premise that the grades were fixed, and therefore that lower grade boundaries essentially mean a harder set of papers. If we look at the charts below, we can see that the percentages of marks required for Grades 7+, 4+ and 1+ were broadly similar across Edexcel and OCR. However, the AQA grade boundaries for both grades 4+ and 7+ are significantly lower in both 2018 and 2019. At grade 7 the difference is 15%, and at grade 4 it is around 10%. This would suggest that pupils found the AQA papers noticeably harder/less accessible than pupils taking the other boards, despite the additional time AQA offered in a revision to the exams for the 2019 series. Again, this does not affect final grades, but levels of confidence and accessibility are key when working with pupils. The difficulties AQA pupils faced could be down to any number of things, from poorly written papers, poorer preparation, unclear mark schemes or poorer marking teams, to simply too much content to cover. There is no easy way to know the answer, but the similarities between the other boards paint a stark picture here.

The year-on-year changes are also quite interesting. By comparing the percentage of marks required for different grades across years, it is possible to see whether papers have become harder or easier. The picture here suggests little change at Grade 7+.
However, the analysis suggests that pupils studying AQA in 2019 found the papers slightly more accessible at Grade 7+ (2% change) and Grade 4+ (9% change). OCR B saw a larger shift at Grade 7+ (3% more accessible) and Grade 4+ (13% more accessible). Edexcel meanwhile saw very little change at Grades 4+ and 7+ in terms of accessibility, while OCR A papers actually became slightly harder to access for pupils. All boards apart from OCR A were apparently more accessible for the weakest, with shifts from 20% (AQA) to 33% (OCR B) in the positive direction. This means that pupils gaining a Grade 1 in 2019 required 5% of the marks for AQA and 9% of the marks for OCR B. This compares to an average of 18-19% in 2017, so accessibility is still a major issue for the weakest students!

Takeaway lesson: If you are doing AQA and you have a very mixed cohort, or a large number of 3-4 borderline students, this may well not be the right specification for you. Have a think about some of the key points on spec switching. That said, AQA have done a little to address the issues, but not enough to close the gap in 2019. OCR B seem to be offering the most accessible papers in terms of pupils being able to answer the questions set, though this may vary between options, and all papers are nowhere near as accessible as in 2017.

5) Should I change exam boards?

No major changes here, but I have included a comparison of board sign-up below. It would seem very few people changed in the 2018-19 year, though this was to be expected given that the first set of results had not been published when the current cohort began their studies. It also means that OCR's decline in market share remains pronounced.

OK, so this is more of a question than something we have found out; however, it is one a lot of people are asking. It isn't possible to provide a definitive answer, but I hope some of the explanations above help you contextualise your results a bit.
If you are still unsure what to do, consider the advice below.
Below is a list of possible reasons for your exam results being poorer than in previous years.
If you are still considering changing boards, bear the following points in mind:
20 Comments
Sue Ellis
8/25/2018 07:26:52 am
As an experienced teacher who has recently moved schools and exam boards, these points are hugely reassuring and helpful, thank you.
Alex Ford
8/25/2018 08:57:38 am
Happy to help Sue. Good luck with it all!!
Christine Counsell
8/25/2018 02:35:08 pm
Thanks for this Alex. History departments everywhere will find it very useful.
Alex Ford
8/25/2018 04:51:58 pm
Thanks for your thoughts as ever Christine. I have to say I am increasingly drawn to the idea of a single national board. The idea of a market in exams does not really work to the advantage of good history (or other subjects) so I am not really convinced of the benefits of separate boards. Realistically they end up competing on “ease” due to many of the things you have outlined here.
Henry Walton
8/24/2019 07:00:59 am
Thank you for all this, including the 2019 updates. The continued existence of separate exam boards, and the efforts by some to encourage switching based on their easiness (whatever they dress it up as), is a disgrace. E.g. this, in an email received yesterday:
Karen Fairbairn
8/25/2018 09:26:55 pm
Some truly thought-provoking comments. After all this I'm more in the "better the devil you know" frame of mind. I want to see another year before changing exam boards.
Alex Ford
8/26/2018 07:24:22 pm
Hope it all goes well. I think it is probably a wait-and-see year coming up for most people.
Kalie Dowling
8/26/2018 01:13:53 pm
Hello Alex, thank you for all of the info gathering and analysis you are doing here & on Twitter - it is certainly illuminating.
Alex Ford
8/26/2018 07:22:55 pm
That is interesting indeed, and thanks for sharing. It would be interesting to see if others found a similar effort->reward pattern. It is difficult to quantify, and almost impossible to see in the macro data, which is why stories like this are so important.
Tracy Bowen
8/27/2018 09:33:34 am
Yes, we found the same. Students who worked hard and listened to what we were telling them got the grades that they deserved. Some of our weaker ones, whom we expected might fail, managed to scrape in with a grade. We only had one U, and he was only one mark off a grade. Our clever but lazy students, some of whom were switched off by the vast content of the new GCSEs, were the ones who under-performed. I marked Power and the People and my colleague marked the Normans, and we both found that a very large number of students were leaving questions completely blank. The ones that had written something were on average getting half marks, and very few if any were getting into Level 4 (the top level).
Esther
8/25/2019 04:10:23 pm
We do Edexcel and had the same experience. Students who grafted, listened to us about exam technique and did systematic revision were well rewarded. Those who didn't performed badly. Under the old spec, motivated to do well in CA and with less content to revise, a number of students who achieved 2s/3s this year would previously have got Cs. In this regard the new GCSEs are probably better at rewarding the hard-working students.
Chloe Watson
8/26/2018 10:52:26 pm
My experiences mirror those of Kalie above, except that we use the Edexcel spec. My higher attainers were fine, and some hardworking girls who may ordinarily have got a C on the legacy spec actually overachieved. However, the groups that underachieved massively on this spec were the bright WWBs and the WWGs. The girls gave up several months before the exams, presumably because they realised the amount of effort revision would require with so much content. They prioritised other, more 'important' subjects due to the massively increased workload required of all the new GCSEs. This led to them underachieving by several grades. The bright WWBs have traditionally been able to 'blag' a C grade to some extent, especially with the old sources paper on OCR, which had limited content to revise. They could revise this, were able to access the sources questions, and were usually able to pass on the back of this (and of course, controlled assessment). Now, with a huge amount of content to revise, very little of which is actually examined but with no way of narrowing revision down, many were bright enough to access the question types but had no detailed knowledge to back up their answers. This can be seen in the marked difference in performance on Paper 2, which is primarily a knowledge-based paper. I am unsure how we will get around this in future, other than moving to a three-year GCSE, as we have done, which will allow for more in-class revision to take place.
cathy
9/2/2018 03:49:49 pm
Thanks for doing this analysis, Alex. I think when looking at any kind of statistical analysis one should be very careful, and I am always mindful of the quip that statistics are used much as a drunk uses a lamppost: for support rather than illumination! Yes, on the statistical face of it, things seem to look the same, standards similar to previous years, etc. The reality at the chalkface is very different.

We changed from AQA to Edexcel and it has been nothing short of a disaster. At every grade boundary, the raw score for Edexcel was higher than for AQA and Eduqas by some considerable margin. 66% at AQA would gain a grade 9, whilst a similar score for Edexcel would gain a grade 6. These are big differences, and they have affected whether good historians of the future now feel they can do A Level. Surely there should be closer parity between boards? This difference would imply that Edexcel was an easier set of papers, but this is not the case in any of the 3 papers pupils had to sit. In fact, the papers were so narrow that instead of being a fair test of historical understanding and rigour, they have become a set of lottery questions. This probably accounts for the many blank answers on paper 2 that Tracy and other markers saw. One of my department calculated that pupils were only really tested on 30% of the course.

We have spent time reading through the weighty examiners' reports - 142 pages for all 3 papers! The fact these reports are so long suggests many things. As Christine mentioned, there are many hoops to be jumped through which it would have been nice to know about, or have some inkling of, before the examination papers were taken. I am left wondering why Ofqual only became involved in the Science results and not in these differences in raw scores for what is the same subject area. Within my department we have approaching a combined 100 years' worth of experience, and as a group we have always advocated the idea that all pupils should study History.
The effects of the last 2 years and results day have changed all that.
Alex Ford
9/2/2018 09:38:54 pm
Agree on the dangers of data Cathy. Things like this only ever give big pictures, not individual cases which will always vary.
Louise Quigg
8/24/2019 06:11:57 pm
Thank you for this analysis - very thought-provoking. I especially liked the suggestion about a single national board. I find the difference in grade boundaries between exam boards astonishing. Our pupils were required to get 92% for an A*/9, but for AQA it was 68% or thereabouts, depending on the option. Surely Ofqual would not allow CCEA to set a paper so much easier than AQA's that it would account for a 22% difference? My mind boggles at this every year. Your article was insightful - thank you.
Nick
8/27/2019 03:32:49 pm
Hi Alex,
DaveW (sorry not using my real name)
9/1/2019 01:55:21 pm
Thanks for the analysis Alex, it's greatly appreciated. I've read what you have had to say with interest.
Image (c) LiamGM (2024) File: Bayeux Tapestry - Motte Castle Dinan.jpg - Wikimedia Commons