[NB. this blog was updated with Q6&7 on 25 August]
Ah, who doesn't love that famously dull election slogan from Warren G Harding? Still, it won him the Presidency, so it'll do for me. It has been a long time since I posted a blog here! This is my endeavour to get my brain back into work mode after the summer break. I will hopefully be giving some more updates on the SHP Curriculum PATHS project here soon. More on that here: Curriculum PATHS - Schools History Project

If you have followed my blog in the past you may be aware that I did a similar analysis back when the current GCSEs were first examined in 2018, and again in 2019. The aim was to look at how the new exams compared to the old ones and the potential impact of any changes. You can find it HERE. We then had a long break due to Covid and its ongoing impact on the awarding of exams.

Today I want to revisit some of the questions from 2018 and explore what this year's GCSE History results might reveal about some of the main history specifications on offer for GCSE in England. I will be addressing the following questions; please do scroll down to what interests you. I have included my short takeaway answer as well as a longer analysis.
As ever, I am grateful for any comments or questions you might have, and am happy to chat further about any of this. These days you can find me on Bluesky at @apf102.bsky.social. Full disclosure: as an SHP Fellow, I am of course connected with the OCR History B (SHP) specification. That said, the SHP Fellows and I do not set the papers; that is done by OCR. The aim here (as you will hopefully see) is to offer an honest analysis of the results for HoDs and others interested in exploring the bigger picture of the history exams over the past few years. Finally, I want to say a huge thank you to AQA, who put all their data in XLS format, making this job infinitely easier!! Read on for more...
Hello everyone. This is an updated version of last year's post on GCSE history results. I have left the previous post intact but added the new results for 2019 in blue text for those interested, and commented where things have changed from 2018. The first version of this post was published on 24 August 2018.

-------

Evening all, I thought I'd take a few minutes to outline five key things we found out about the new History GCSEs. OK, it's really four and a question, but hey ho!

1) Pupils this year did pretty much the same as last year overall

No changes here, and no surprise given the details below. I have updated the chart to show the 2019 results side by side. This is not really a surprise, as Ofqual demanded a statistical tie between 2017 and 2018. Therefore almost the same proportion of kids got a G/1 or better as last year, and the same goes for C/4 and A/7. There were some minor differences, but at a school cohort level of, say, 100 pupils, the difference would have been less than a single pupil missing out on a grade C/4.
Of course, this does not mean everyone's results will have been stable. It is common with new specifications for some schools to do much better than normal and some to do much worse. This is usually because some schools manage to match what examiners were looking for more closely. It is almost impossible to second-guess this precisely in the first year of a specification, as examiners refine their expectations during first marking and grading discussions.

Takeaway lesson: NO CHANGES HERE. Read the examiners' reports closely and get some papers back if you were not happy with your overall results.

2) Your choice of board made almost no difference to overall grades (on average)

There is very little change here this year. The distribution of awards per board seems to be fairly static, and this reflects the fact that awards are still tied to pupils' prior attainment. From this we can infer that centres doing OCR A tend to have cohorts with higher prior attainment, and that a greater proportion of higher grades can therefore be awarded.

Discounting the statement at the end of the last point: because the boards all had to adhere to this basic rule when awarding grades, the differences between boards are also virtually non-existent. If you look at the detail you will see that some boards did deviate from the 2017 figures, but this is because they have to take prior attainment into account. So the fact that OCR A seems to have awarded more 4+ and 7+ grades would suggest that more high-attaining pupils took these exams. By contrast, OCR B probably awarded slightly fewer 4+ and 7+ grades due to a weaker cohort. This might imply that OCR centres chose their specification based on the ability range of their pupils (though this is pure speculation). AQA and Edexcel pretty much fit the Ofqual model, suggesting they had a broadly representative sample of pupils.

Note: my final blog offering suggestions for real examination reform is now live here.
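For anyone who fancies poking at the published figures themselves, here is a rough sketch of the kind of comparison involved. To be clear, this is just an illustration rather than my actual workflow: the file name and column headings are hypothetical placeholders, and you would need to swap them for whatever is in the spreadsheet you download (the AQA XLS files are the easiest starting point).

```python
# A minimal sketch (not my actual workflow) of comparing cumulative grade
# distributions across boards. The file name and column names below are
# hypothetical placeholders for whatever the published spreadsheet contains.
import pandas as pd

# Hypothetical spreadsheet: one row per board, with columns holding the
# cumulative percentage of pupils achieving 7+, 4+ and 1+.
results = pd.read_excel("gcse_history_results.xlsx")

for _, row in results.iterrows():
    print(f"{row['Board']}: 7+ {row['7+']:.1f}%, 4+ {row['4+']:.1f}%, 1+ {row['1+']:.1f}%")

# Express the gap between two boards at grade 4+ as pupils per 100-pupil cohort.
by_board = results.set_index("Board")
gap = abs(by_board.loc["AQA", "4+"] - by_board.loc["OCR B", "4+"])
print(f"Difference at 4+: roughly {gap:.1f} pupils in every 100")
```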
In my previous blog I looked at the ways in which the marking of examinations in England is particularly problematic in subjects like History and English. For a full review of this, you may like to read Ofqual's blog and report on the subject. Today I want to deal with the question of what action the educational establishment might take. GCSE and A-Level examinations are still held up as vital to the education system. The impetus has been to cement their place as the "gold standard" both nationally and internationally. If we want this to be true in reality, I wonder if we need a more fundamental rethink of what examinations (especially GCSE examinations) are for, and therefore what they might be, in an educational landscape which is vastly different from that of the late 1980s when GCSEs were first introduced.

What are GCSE examinations for? The seemingly simple question of the purpose of GCSE examinations is actually very complex indeed. But of course, as with all assessment, validity is heavily connected to the inferences one wants to draw from an exam (Wiliam, 2014). The range of inferences which are suggested as valid from a set of GCSE examinations is extremely diverse:

By now many of you will be considering what you will be teaching for the new GCSE units, which are launching in September 2016. The less fortunate of you may even be teaching them already, despite the fact the specification documents are still in draft, but that is an issue for another day. One thing you will certainly have noticed, if you have begun the process of choosing already, is that there are now an extra two units for students to cover in their two (or three!!) years. To recap, students now have to study:
One of the most important tasks for history departments over the next few months will be narrowing down and choosing which specification best fits your students, expertise, interests and (sadly) resources (again, I might make this a future blog). Once you have decided on a suitable route, you can then think about mapping out how you will cover each of the units in the 10-12 weeks allocated by the new specification materials. This is also a good way to test specifications, as some certainly have an awful lot of content to cover! I have already written about the process of unit planning for the new A Level HERE and HERE, highlighting the importance of excellent subject knowledge in planning meaningful units. I will not repeat that, but if you are considering issues of planning for GCSE then those posts would be a good starting point. The one worry I hear a lot with the revised GCSE is that it demands a lot of content knowledge and may be inaccessible for weaker students. I therefore want to spend the rest of this post exploring these claims and considering how we might respond as history teachers who want every child to be able to access and enjoy really great history.

So today's post is not really a focus on awesome 90s dance music, but it is a follow-up to Rich Kennett's excellent blog on preparing a scheme of work for the new A Levels using enquiry questions. This is a process I have been engaged in for a few months now. What strikes me is how much I have had to think about how I want to structure the course, even though it is largely (but not entirely) similar to our current A Level. What I have decided is that planning for developing knowledge is, as the Urban Cookie Collective would (probably) say, both the key and the secret to planning a great A Level course.
To give you some context: we currently teach AQA's AS unit on Russia 1855-1917. This feeds into an A2 on the USSR 1941-1991. Under the A Level changes we have opted for the new unit which covers 1855-1964 (splitting at 1917 for the purposes of the AS examination). On the surface this seemed like the obvious choice, and the specification document seemed to bear this out, covering areas we were already comfortable teaching. However, it didn't take long before we started encountering issues in planning the AS portion of the unit.

So yesterday I delivered a session about progression in History in a post-levels world at the Leeds Learning Network History Conference. The most valuable part of the day for me, other than the excellent sessions, was the chance to speak to other heads of department about how their schools were approaching the issue of assessment and reporting in Key Stage 3. The major worry across the board was that we would end up replacing the Levels system with something essentially identical. In the worst cases, schools seemed set on keeping Levels and not updating assessment and reporting arrangements at all. The main reason for this seems to come down to the age-old issue (or is it actually a recent issue?) of accountability.
This got me thinking. Is there any point in replacing our progression systems if we end up keeping a debunked system of assessment and reporting? Now, I completely accept that schools are (in our current culture) going to have to show evidence of pupil progress, as it forms a major part of the Ofsted framework. However, I think there may be some ways we can make something which both satisfies the need for data reporting and allows us to develop and use the meaningful models of progression we have been crafting over the last few months. Once again, I would like to thank Helen Snelson at the Mount and Michael Fordham at Cambridge for their inspiration on these issues! What is crucial for me, as for many others, is that we don't let our progression revolution die a death at the hands of data systems wedded to an outmoded way of thinking.

For a brief moment, I genuinely thought that I might get through this week without reading anything too upsetting about education in the news. A few days ago, Mr Gove seemed to switch his attention to private schools, attacking them as 'islands of privilege'. Then yesterday the Secretary of State for Education was forced to back down on his plans to reform school teachers' pay and conditions. The STRB delivered a full sweep of humiliating defeats to Mr Gove's plans to change everything from the school day to forced extra-curricular activities.
Of course, this couldn't last too long. This morning I awoke to news which nearly had me choking on my cornflakes. The exams regulator Ofqual has apparently decided that so many schools are sending exam scripts for re-marking because we are all busy manipulating our A*-C pass rates. To quote the review paper specifically, Ofqual stated that "A high volume of enquiries about results are, we believe, motivated by a speculative attempt to improve results..."

The AndAllThat.co.uk teacher blog is moving from WordPress onto the main website. From now on you will find non-topic related content here. You can still access the archives from the WordPress site by visiting http://andallthatweb.wordpress.com. I will endeavour to transfer the content over the next few months. Mr F

I have to say that writing about examinations does not rank amongst my favourite pastimes, yet as the GCSE fiasco has emerged, I have found myself constantly asking: who is really surprised? Every teacher surely has felt the pain of results day when you have no idea if your results are your doing, their doing or the doing of an examinations committee… Accountability? You must be joking. I sat and longingly read the details of the Queensland examinations system, which puts schools at the centre… I took this quote from the opening page of the document:

"It cannot be over-emphasised that the mode of assessment dictates the nature of the educational experience and the quality of the relationship between teacher and pupils. Assessment is not something separate – a tool – by which education may be evaluated; it acts upon the educational system so as to shape it in accordance with what the assessment demands. You cannot have, at one and the same time, education for personal growth and a totally impersonal system of assessment. Assessment should be a bond between teachers and taught, not something which threatens and antagonises.
To humanise assessment, then, we have to make of schooling a more co-operative enterprise between teachers and pupils, and an opportunity to develop the whole range of human competencies, leading up to informative profiles. This should be the pattern of things for the immediate future; it is the way to shed the dreary, and often unjust, grading techniques of traditional education." (Hemming, 1980, pp. 113–14)

…and then I wrote this:
Image (c) LiamGM (2024) File: Bayeux Tapestry - Motte Castle Dinan.jpg - Wikimedia Commons