
UPDATED: New History GCSEs: What We Learnt on Results Day Mk2

8/23/2019

18 Comments

 
Hello everyone. This is an updated version of last year's post on GCSE history results. I have left the previous post intact but added the new results for 2019 in blue text for those interested, and commented where things have changed from 2018. The first version of this post was published on 24 August 2018.
-------
Evening all, I thought I'd take a few minutes to outline five key things we found out about the new History GCSEs. OK, it's really four and a question, but hey ho!

1) Pupils this year did pretty much the same as last year overall
No changes here, and no surprise given the details below. I have updated the chart to show the 2019 results side by side.

This is not really a surprise as Ofqual demanded a statistical tie between 2017 and 2018. Therefore almost the same proportion of kids got a G/1 or greater as last year, and the same for C/4 and A/7. 
There were some minor differences, but at a school cohort level of say 100 pupils, the difference would have been less than a single pupil missing out on a grade C/4.
Of course, this does not mean everyone’s results will have been stable. It is common with new specifications for some schools to do much better than normal and some to do much worse. This is usually because some schools manage to match what examiners were looking for more closely. It is almost impossible to second guess this precisely in the first year of a specification as examiners refine their expectations during first marking and grading discussions.

Takeaway lesson:

NO CHANGES HERE

Read the examiners’ reports closely and get some papers back if you were not happy with your overall results.

2) Your choice of board made almost no difference to overall grades (on average)
There is very little change here this year. The distribution of awards per board seems to be fairly static, reflecting the fact that awards are still tied to pupils' prior attainment. From this we can infer that centres doing OCR A tend to have cohorts with higher prior attainment, and that a greater proportion of higher grades can therefore be awarded.

Setting aside the statement at the end of the last point: because the boards all had to adhere to this basic rule when awarding grades, the differences between boards are essentially non-existent. If you look at the detail you will see that some boards did deviate from the 2017 figures; however, this is because they have to take prior attainment into account. So the fact that OCR A seem to have awarded more 4+ and 7+ grades would suggest that more high-attaining pupils took these exams. By contrast, OCR B probably awarded slightly fewer 4+ and 7+ grades due to a weaker cohort. This might imply that OCR centres chose their specification based on the ability range of their pupils (though this is pure speculation). AQA and Edexcel pretty much fit the Ofqual model, suggesting they had a broadly representative sample of pupils.
Takeaway lesson:

None really!

3) The papers this year continue to be harder than pre-2018 [updated statement]
Comparing the raw marks required for different grades, there continues to be a big difference between pre-2018 exams and the current crop. Differences between the 2018 and 2019 grade boundaries, however, were fairly minimal. That said, there were some interesting differences, which I have outlined in the charts below.

Given that the proportion of students getting grade 1+, 4+ and 7+ were fixed, the moving grade boundaries tell us a little about how difficult/accessible the papers were:
  • If the grade boundaries go up it suggests pupils found the paper easier
  • If they go down, it suggests the pupils found the paper harder

In the chart below, you can see that in 2017 pupils in history generally needed 74% of the marks to be awarded a grade A; in 2018, they needed only 60%. In 2019 this increased a fraction to 61%. As a rough measure, this suggests the A grade in history was 19% harder/less accessible for pupils in 2018 vs 2017, but "only" 18% harder (!!) in 2019. The big issues are much more evident at Grade C/4 and G/1, where pupils needed just 37% and 6% of the marks on average in 2018 vs. 55% and 18% in 2017. In 2019, the Grade 4 figure sat at 39% of marks and the Grade 1 at 7%. In essence this makes the 2019 exams 30% harder for pupils to get a Grade 4 than in 2017 (vs 34% harder in 2018) and 60% harder to get a Grade 1 (vs 69% in 2018).
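For clarity, the "harder/less accessible" percentages above are simply the relative drop in the proportion of marks required, taking 2017 as the baseline. Here is a quick sketch of the sums; it uses the rounded boundary percentages quoted above, so the outputs differ by a point or two from the figures in the text, which were calculated from unrounded data:

```python
# Relative change in the percentage of marks needed for a grade:
# a negative value means the boundary fell, i.e. the paper was
# harder/less accessible than in the baseline year.

def relative_change(baseline_pct, new_pct):
    """Percentage change in the marks required, vs the baseline year."""
    return round(100 * (new_pct - baseline_pct) / baseline_pct)

# Rounded boundary percentages quoted in the post (2017 baseline).
boundaries = {
    "A/7": {2017: 74, 2018: 60, 2019: 61},
    "C/4": {2017: 55, 2018: 37, 2019: 39},
    "G/1": {2017: 18, 2018: 6,  2019: 7},
}

for grade, by_year in boundaries.items():
    for year in (2018, 2019):
        change = relative_change(by_year[2017], by_year[year])
        print(f"{grade} in {year}: {change:+d}% vs 2017")
```

A result of -19 for A/7 in 2018, for example, corresponds to the "19% harder" figure above.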
Now we could argue that this does not matter as the same proportion of students still got the various key grades, as explained above. However, there are some fundamental issues here.

  1. These were meant to be strengthened GCSEs to raise standards. There were around 18 questions across 2 or 3 papers for most GCSE specifications. On average, each of these questions was worth around 9 marks. A pupil who gained a Grade 1 in 2018 probably got only around 10 marks in total over these papers. In other words, they answered only 1 complete question correctly across 3 papers and over 4 hours of exams. By contrast, a pupil in 2017 was getting around 36 marks across their papers, and therefore answering at least 3 whole questions correctly. This does not change substantially in 2019.
  2. We cannot be sure why pupils scored so badly at the lower end, but the fact that marks are so low at the Grade 1 boundary (and even Grade 4) suggests that many pupils failed to finish papers, or possibly barely attempted whole papers. For anyone who has worked with pupils who lack confidence, non-finishing and being put off by difficult questions can be a killer in terms of examination success. At the very least, pupils in 2017 were given a chance to demonstrate their knowledge more completely than in 2018. The very low marks at the bottom end also make the issue of fair grading a bit of a crapshoot. It would have been possible for a child to get lucky on a single question (something they had just looked at) and secure a Grade 1 without answering a single other question on the paper. Meanwhile a child with weak knowledge and limited confidence might have attempted 3 or 4 questions with a little success, then given up, and received the same grade. Again, no real change here, except at board level.
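The back-of-the-envelope sums in point 1 can be checked quickly. This sketch assumes the rough averages quoted above (around 18 questions at around 9 marks each); exact totals vary by board and option:

```python
# Rough paper totals quoted above: ~18 questions averaging ~9 marks each.
questions = 18
marks_per_question = 9
total_marks = questions * marks_per_question  # ~162 marks across the papers

# The 2018 Grade 1 boundary sat at roughly 6% of the marks...
grade1_marks_2018 = round(0.06 * total_marks)  # ~10 marks
# ...which is roughly one complete question answered correctly.
questions_worth_2018 = grade1_marks_2018 / marks_per_question

# A 2017 G-grade pupil was getting around 36 marks,
# i.e. at least 3-4 whole questions' worth.
questions_worth_2017 = 36 / marks_per_question

print(total_marks, grade1_marks_2018, questions_worth_2018, questions_worth_2017)
```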

Takeaway lesson:

NO CHANGES
Your top end pupils are probably fine with all the changes, but your weaker students are going to struggle. Strategies to finish papers, or answer more questions will be key, alongside boosting core knowledge and building confidence in the face of hard questions.
 
4) Some boards were harder/less accessible than others
I have added some further 2019 analysis to this section

Following on from the above, a closer analysis of the grade boundaries reveals some interesting things. Again, we take the premise that the grades were fixed and therefore lower grade boundaries essentially mean a harder set of papers.

If we look at the charts below we can see that the percentage of marks required for Grades 7+, 4+ and 1+ was broadly similar across Edexcel and OCR. However, the AQA grade boundaries for both grades 4+ and 7+ are significantly lower in both 2018 and 2019. At grade 7 the difference is 15% and at grade 4 it is around 10%. This would suggest that pupils found the AQA papers noticeably harder/less accessible than pupils taking the other boards, despite the additional time AQA offered in a revision to the exams for the 2019 series.
Again, this does not affect final grades, but levels of confidence and accessibility are key when working with pupils. The difficulties AQA pupils faced could be down to any number of things, from poorly written papers, poorer preparation, unclear mark schemes, or poorer marking teams, to just too much content to cover. There is no easy way to know the answer, but the similarity between the other boards paints a stark picture here.

The year-on-year changes are also quite interesting. Comparing the percentage of marks required for different grades makes it possible to see whether papers became harder or easier between 2018 and 2019. The picture suggests relatively little change at the top end: pupils studying AQA in 2019 found the papers slightly more accessible at Grade 7+ (2% change) and at Grade 4+ (9% change). OCR B saw a larger shift at Grade 7+ (3% more accessible) and Grade 4+ (13% more accessible). Edexcel meanwhile saw very little change at Grades 4+ and 7+ in terms of accessibility, while OCR A papers actually became slightly harder to access. All boards apart from OCR A were apparently more accessible for the weakest, with shifts from 20% (AQA) to 33% (OCR B) in the positive direction. This means that pupils gaining a Grade 1 in 2019 required 5% of the marks for AQA and 9% for OCR B. This compares to an average of 18-19% in 2017, so accessibility is still a major issue for the weakest students!


Takeaway lesson:

If you are doing AQA and you have a very mixed cohort, or a large number of 3-4 borderline students, this may well not be the right specification for you. Have a think about some of the key points on spec switching. That said, AQA have done a little to address the issues, but not enough to close the gap in 2019. OCR B seem to be offering the most accessible papers in terms of pupils being able to answer the questions set, though this may vary between options, and no board's papers are anywhere near as accessible as in 2017.
 
5) Should I change exam boards?
No major changes here, but I have included a comparison of board sign-up below. It would seem very few people changed in the 2018-19 year, though this was to be expected given that the first set of results had not been published when the current cohort began their studies. It also means that OCR's decline in market share remains pronounced.
OK, so this is more of a question than something we have found out, however it is one a lot of people are asking. It isn’t possible to provide a definitive answer to this, however I hope some of the explanations above help you contextualise your results a bit. If you are still unsure what to do, consider the advice below.
Below is a list of possible reasons for your exam results being poorer than in previous years. 
  • Your students might not have approached the exam in the way the examiners intended for any number of reasons.
  • Read the examiner reports closely and request a range of papers. If you can’t afford your own papers, ask the board to send you some scripts from across the range. Keep asking the boards for advice. It never hurts to have at least one department member marking either!
 
  • Your students might not have answered enough questions, or may have failed to finish papers, as seems to have been common this year.
  • Work on timing and confidence building. Some more realistic grade boundaries will probably help here, but bear in mind these will probably go up next year as people get used to the exam.
 
  • You did not cover the course completely.
  • Rethink your planning in light of the depth of understanding required by the exam. One big issue I have seen a lot is departments taking legacy units and spending too long on things which are now less important in the new specs, e.g. ancient medicine. You might also like to consider how conceptual understanding might be supported by a re-worked Key Stage 3, both in terms of knowledge and second-order concepts. Hodder are about to publish a new KS3 book to this end. You might also like to read the recent article by Rich Kennett and me in Teaching History 171 via https://www.history.org.uk/publications/categories/300/resource/9398/teaching-history-171-knowledge
  • Any number of departmental, school, other issues
  • This happens all the time. If you have been in turmoil, a change of exam board is not always wise. A bit of stability often goes a long, long way in a troubled department or school! Seek out help from schools doing the same spec nearby. Get on any exam board, or other training ASAP. If you are OCR B, don’t forget regional advisors.

If you are still considering changing boards, bear the following points in mind:
Sue Ellis
8/25/2018 07:26:52 am

As an experienced teacher who has recently moved schools and exam boards, these points are hugely reassuring and helpful, thank you.

Alex Ford
8/25/2018 08:57:38 am

Happy to help Sue. Good luck with it all!!

Christine Counsell
8/25/2018 02:35:08 pm

Thanks for this Alex. History departments everywhere will find it very useful.

Reflecting on it all though, it does raise for me two related concerns: (i) the time teachers/depts/schools have to spend trying to choose the best board or spec for their pupils in the interests of better results - it's a lot of speculative work purely about process; (ii) the impossibility of avoiding the ineluctable future pull towards easier papers (however that is dressed up) as boards inevitably compete for market share.

If the market works in the interests of strong history - eg it's driven by the fourth of your very wise 'DO's - then the market (history teacher professional choice) is working usefully. But I fear it will inevitably be compromised by other factors. 'Hoops', as in expected, specific formulae for responses, inevitably arise because they are friends of predictability/reliability, and wider pressures on history depts, trouncing those of judgement about better history (even where they would wish to exercise it) will creep back in, so the old malaise resurfaces. If the market pushes against hoops - a necessary assumption being this kind of rational choice of scholarly/professional integrity in your 'DO' - then great. But will it?

Radical question therefore - might we all be better off (re both fairness & subject rigour) with a single exam board for each subject? I recognise the dangers of that too, but reading your blog, and thinking about how public qualifications work in other jurisdictions, I can't help thinking that we need to keep debating its merits.

Thanks again for a very useful analysis.

Alex Ford
8/25/2018 04:51:58 pm

Thanks for your thoughts as ever Christine. I have to say I am increasingly drawn to the idea of a single national board. The idea of a market in exams does not really work to the advantage of good history (or other subjects) so I am not really convinced of the benefits of separate boards. Realistically they end up competing on “ease” due to many of the things you have outlined here.

Henry Walton
8/24/2019 07:00:59 am

Thank you for all this, including the 2019 updates. The continued existence of separate exam boards, and efforts by some to encourage switching based on their easiness (whatever they dress it up as), is a disgrace. E.g. this in an email received yesterday:
Disappointed with your GCSE History results?
It’s time to explore Eduqas

I know moves towards single exam boards were considered but rejected - when they came up against considerable vested interests and political preference for competition, I think - a few years ago. I would be delighted if wise people, such as those in the HA, could push for reconsideration.

Karen Fairbairn
8/25/2018 09:26:55 pm

Some truly thought-provoking comments. After all this I’m more in the 'better the devil you know' frame of mind. I want to see another year before changing exam boards.

Alex Ford
8/26/2018 07:24:22 pm

Hope it all goes well. I think it is probably a wait and see year coming up for most people

Kalie Dowling
8/26/2018 01:13:53 pm

Hello Alex, thank you for all of the info gathering and analysis you are doing here & on Twitter - it is certainly illuminating.
I thought I would share a comment about the results my cohort achieved as they offer a slightly different slant to the pass rate analysis above...
We use AQA and my top students achieved the 9-7s I was expecting, and my weaker students achieved at least as well as I expected, in fact a number overachieved - including a few who got grade 5 where I had expected a 3.
The group that seem to have skewed the results down for me were those (typically HPA) students who, by their own admission, ‘hoped they had done just about enough’ revision/prep. These students got a 3 (when they really should have been capable of getting a 6) and on the legacy spec they probably would have managed to get a C.
The results from the new AQA spec seem to tell me that these exams will not be very forgiving to those who do not put in the effort - the grades my lot got seem to directly reflect their effort/attitude in the run up to the exams - and in that sense they have certainly become ‘harder’.
The P8 scores should help me understand if my cohort were an anomaly or if there is a bigger pattern.
I do still have confidence in the AQA exam questions, so I am therefore left pondering the best strategies to employ in order to try and mitigate facing a similar outcome next year!
Thank you again for sharing all of this analysis.

Alex Ford
8/26/2018 07:22:55 pm

That is interesting indeed and thanks for sharing. It would be interesting to see if others found a similar effort->reward pattern. Difficult to quantify and almost impossible to see in the macro data, which is why stories like this are so important

Tracy Bowen
8/27/2018 09:33:34 am

Yes, we found the same. Students who worked hard and listened to what we were telling them got the grades that they deserved. Some of our weaker ones that we expected might fail managed to scrape in with a grade. We only had one U and he was only one mark off a grade. Our clever but lazy students, some of whom were switched off by the vast content of the new GCSEs, were the ones who under-performed. I marked Power and the People and my colleague marked the Normans, and we both found that a very large number of students were leaving questions completely blank. The ones that had written something were on average getting half marks, and very few if any were getting into Level 4 (the top level).

Esther
8/25/2019 04:10:23 pm

We do Edexcel and had the same experience. Students who grafted, listened to us about exam technique and did systematic revision were well rewarded. Those who didn't performed badly. Under the old spec, motivated to do well in CA and with less content to revise, a number of students who achieved 2/3s this year would previously have got Cs. In this regard the new GCSEs are probably better at rewarding the hard-working students.

Chloe Watson
8/26/2018 10:52:26 pm

My experiences mirror those of Kalie above, except that we use the Edexcel spec. My higher attainers were fine, some hardworking girls who may have ordinarily got a C on the legacy spec actually overachieved. However, the groups that underachieved massively on this spec were the bright WWBs and the WWGs. The girls gave up several months before the exams, presumably because they realised the amount of effort revision would require with so much content. They prioritised other, more ‘important’ subjects due to the massively increased workload required of all the new GCSEs. This led to them underachieving by several grades. The bright WWBs have traditionally been able to ‘blag’ a C grade to some extent, especially with the old sources paper on OCR which had limited content to revise. They could revise this, were able to access the sources questions and were usually able to pass on the back of this (and of course, controlled assessment). Now, with the huge amount of content to revise, very little of which is actually examined but with no way of limiting revision, although many were bright enough to access the question types, they had no detailed knowledge to back up their answers. This can be seen by the marked difference in performance on Paper 2 which is primarily a knowledge based paper. Unsure how we will get around this in future, other than moving to a 3 year GCSE as we have done which will allow for more in class revision to take place.

cathy
9/2/2018 03:49:49 pm

Thanks for doing this analysis, Alex. I think when looking at any kind of statistical analysis, one should be very careful & I am always mindful of the quotation that stats are used much like a drunk uses a lamppost: for support rather than illumination! Yes, on the statistical face of it, things seem to look the same, standards similar from previous years, etc. etc. The reality at the chalkface is very different. We changed from AQA to Edexcel and it has been nothing short of a disaster. At every grade boundary, the raw score for Edexcel was higher than AQA and Eduqas by some considerable margin. 66% at AQA would gain a grade 9 whilst a similar score for Edexcel would gain a grade 6. These are big differences and have affected whether good historians of the future now feel they can do A Level. Surely there should be a closer parity between boards? This difference would imply that Edexcel was an easier set of papers, but this is not the case in any of the 3 papers pupils had to sit. In fact, the reality was that the papers were so narrow that instead of being a fair test of historical understanding and rigour, they have become a set of lottery questions. This probably accounts for the many blank answers on paper 2 that Tracy and other markers saw. One of my department calculated that pupils were only really tested on 30% of the course. We have spent time reading through the weighty examiners' reports - 142 pages for all 3 papers! The fact these reports are so long suggests many things. As Christine mentioned, there are many hoops to be jumped through which it would have been nice to know about, or have some inkling of, before the examination papers were taken. I am left wondering why Ofqual only became involved in the Science results and not these differences in raw scores for what is the same subject area. Within my department we have approaching a combined 100 years' worth of experience and as a group we have always advocated the idea that all pupils should study History.
The effects of the last 2 years and results day have changed all that.

Alex Ford
9/2/2018 09:38:54 pm

Agree on the dangers of data Cathy. Things like this only ever give big pictures, not individual cases which will always vary.

That said, as I have hopefully pointed out in the blog, lower grade boundaries are often more of a problem. A grade 9 at 66% means that 34% of the paper was inaccessible except to around 5% of the kids.

The issue of narrow papers is a much, much bigger one for history. For me this goes back to specification writing, as anything in there can be a question. Specs which keep things broad necessitate broad papers. That said, any paper will only ever sample a very tiny section of taught content - it's a result of what we prioritise in UK exams: comparing kids.

I really do sympathise though. I have been in your position in the past when we have changed GCSEs and it is not nice. In positive news, it is likely that things will not get worse next year, and there is a good chance things will get better (see the second blog).

Louise Quigg
8/24/2019 06:11:57 pm

Thank you for this analysis - very thought provoking. I especially liked Christine Counsell’s suggestion about the national board. I find the difference in grade boundaries between exam boards astonishing. Our pupils were required to get 92% for an A*/9, but for AQA it was 68% or thereabouts, depending on the option. Surely Ofqual would not allow CCEA to set a paper so much easier than AQA's that it would account for a 22% difference? My mind boggles every year at this; your article was insightful. Thank you.

Nick
8/27/2019 03:32:49 pm

Hi Alex,

thanks so much for this - it is so helpful and makes me feel much more confident about the national picture when having data review conversations, especially when my department focus is the Grade 3 students.

Do you think the analysis suggests that there is value in moving to streamed classes in KS4? Other departments in my school have done it and it seems to have helped. I really struggle with it for a lot of reasons, and also feel that the biggest challenge for lower-ability pupils is the breadth of content in GCSE. We have to teach everybody everything for them to have a fair chance. A lower-ability class would not be able to go slower across the two years, because they would not get all the content done.

Sorry if this is out of your purview as it is somewhat school specific and separate to the data. But thanks again for your excellent work - it really does help.

Best wishes
Nick

DaveW (sorry not using my real name)
9/1/2019 01:55:21 pm

Thanks for the analysis Alex, it's greatly appreciated. I've read what you have had to say with interest.
From my perspective I'm now discounting the idea of jumping ship and moving exam boards. I loathe AQA with a passion that grows daily but cannot justify swapping to any other board given your evidence and the huge investment we've put into trying to make it work. However I now have to adopt a new approach in Key Stage 3 too because we're too popular!

We get lots of middle-lower students flocking to do GCSE and the truth is for many of them the current exam system is too hard. Getting students to do endless practice questions puts them off and demoralises them when they don't get the high marks in year 10 that they were used to in year 9. In the past three years we've had to develop our students' skills in completing 18 questions and many of these were more suitable to A Level, so it has been a real struggle to make these stick (particularly the interpretation skills). Plus we've had to bring in new depths of subject knowledge recall and use of this in explanations which, with ridiculously rigid mark schemes, has hurt our kids' confidence. I'm having to sink large parts of this course down into Years 7-9 to reduce the problem of transition to KS4! We're at the point where by January of Year 11 we have to work out how to cram in everything students need for the last unit, do the revision, develop new skills and make sure we complete the course. It's a losing game. The fun element has gone because there's the slog through 4 units in a very short time-scale. Then we get to the point where less able kids realise they have two exams on the same day, English AND History or Maths, so their priorities lie in other subjects. Basically I go into classrooms and I see kids not waving but drowning in the brutality of the new GCSE course.

I have to inform my colleagues that they should not encourage less able students to do this AQA course. They'll hate it. I hate it too. We send far fewer students onto KS5 study because they don't want more of the same, so in effect the AQA GCSE course (and probably the other boards too) is actually hurting the study of the subject I love teaching.

In 25 years of teaching I've never felt so disenchanted with the exam system. This year all my classes are getting told to do geography.....



