Mapping GCSE Scores

In the UK, August is exam results month for 16-18 year olds. Every year, photos of leaping teenagers clutching their results are accompanied by reports of record attainment rates, debates about how challenging modern exams are and, more so recently than ever, concerns about the number of sixth form and university places. Back in March the full list of the 2010 GCSE results (exams taken by UK 16 year olds, except in Scotland) was released and I mapped them, but never got round to sharing the maps with anyone. Now seems a good time to do so, so here goes…

The map below uses the increasingly popular cartogram method to show the success of students in each Local Authority (LA) across England. The non-cartogram version is shown alongside.

English 09/10 GCSE Score
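For anyone wanting to try something similar, a minimal sketch of the non-cartogram (choropleth) version is below. It assumes an LA boundary shapefile and a CSV of average GCSE scores with hypothetical column names (la_code, gcse_score); these file and column names are illustrative, not the exact data used for the map above.

```python
import geopandas as gpd
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical inputs: LA boundary polygons and a table of 2009/10 GCSE scores.
# File and column names (la_code, gcse_score) are assumptions for illustration.
boundaries = gpd.read_file("england_local_authorities.shp")
scores = pd.read_csv("gcse_scores_0910.csv")

# Join the attainment data onto the geometry by LA code.
las = boundaries.merge(scores, on="la_code", how="left")

# Shade each Local Authority by its average GCSE score.
fig, ax = plt.subplots(figsize=(8, 10))
las.plot(column="gcse_score", cmap="RdYlGn", legend=True,
         linewidth=0.2, edgecolor="grey", ax=ax)
ax.set_axis_off()
ax.set_title("English 09/10 GCSE score by Local Authority")
plt.savefig("gcse_choropleth.png", dpi=150)
```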


This is quite a coarse map as England is split into only 152 LAs, and we know there is much greater variation between schools at a local level and sometimes even within individual schools. Moreover, schools on authority borders often serve communities from the areas on either side, which limits how well LA-level data reflect the populations they nominally cover. Independent (fee-charging) schools are also included in these broad LA results, which matters given the predictably higher results of fee-paying pupils and the fact that these schools are not distributed evenly across the country. The size of an LA (in school-age population terms) does not seem to have a strong link to the results of its pupils, so there must be other factors at play. Existing evidence indicates that a pupil's level of deprivation has a stark impact on his/her attainment. This is supported by the plot for London below, which shows the relationship between a borough's national deprivation rank (from the Index of Multiple Deprivation, or IMD) and its GCSE score.

London 09/10 GCSE Score and IMD
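A sketch of how such a scatter plot could be produced is below, assuming a small table of London boroughs with hypothetical columns borough, imd_rank and gcse_score (the real figures would come from the DfE and IMD releases).

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical table: one row per London borough with its national IMD rank
# and average GCSE score. Column names are assumptions for illustration.
boroughs = pd.read_csv("london_boroughs_imd_gcse.csv")

fig, ax = plt.subplots()
ax.scatter(boroughs["imd_rank"], boroughs["gcse_score"])

# Label each point with the borough name so outliers are identifiable.
for _, row in boroughs.iterrows():
    ax.annotate(row["borough"], (row["imd_rank"], row["gcse_score"]),
                fontsize=7, xytext=(3, 3), textcoords="offset points")

ax.set_xlabel("IMD rank (lower = more deprived)")
ax.set_ylabel("Average GCSE score (2009/10)")
ax.set_title("London boroughs: deprivation rank vs GCSE attainment")
plt.savefig("london_imd_gcse.png", dpi=150)
```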


Another way to represent this information is by mapping the 2010 GCSE scores for each of the London Boroughs and resizing each borough so that its area represents the level of child poverty (measured by the number of under-16s receiving means-tested benefits).

London 09/10 GCSE Score
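The resizing step can be approximated with a simple non-contiguous cartogram: each borough polygon is shrunk or grown about its own centroid in proportion to the child-poverty count. The sketch below uses assumed column names (child_poverty, gcse_score) and is not necessarily the exact method used for the map above.

```python
import geopandas as gpd
import numpy as np
import matplotlib.pyplot as plt
from shapely import affinity

# Hypothetical input: London borough polygons with a GCSE score and a count of
# under-16s on means-tested benefits. File and column names are assumptions.
boroughs = gpd.read_file("london_boroughs_with_attributes.shp")

# Make polygon *area* proportional to the child-poverty count, so the linear
# scale factor is the square root of the normalised count.
norm = boroughs["child_poverty"] / boroughs["child_poverty"].max()
factors = np.sqrt(norm)

boroughs["geometry"] = [
    affinity.scale(geom, xfact=f, yfact=f, origin="centroid")
    for geom, f in zip(boroughs.geometry, factors)
]

# Shade the resized boroughs by GCSE score.
fig, ax = plt.subplots(figsize=(8, 8))
boroughs.plot(column="gcse_score", cmap="RdYlGn", legend=True,
              edgecolor="grey", linewidth=0.3, ax=ax)
ax.set_axis_off()
ax.set_title("London 09/10 GCSE score, boroughs sized by child poverty")
plt.savefig("london_cartogram.png", dpi=150)
```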


Again, the map above is not perfect as it is still quite generalised and shows only one of the many measures of child poverty in use. Both maps also show only one measure of attainment: the “GCSE or Equivalent” score. The “or Equivalent” bit is important here as it covers a wide range of more vocational qualifications (such as NVQs) that are often perceived as less academically challenging and can be a way for students to get the equivalent of 5 A* to C grades including English and maths (a key educational benchmark) without having to be proficient in these core subjects. This matters because schools in England are often ranked by the proportion of their students achieving this benchmark, resulting in a possible bias towards schools offering more vocational subjects and against those offering more challenging ones such as modern languages. It is interesting to consider whether the nature of equivalent qualifications makes them more likely to be used by certain types of school, and to explore this further I have produced the plots below. The codes are as follows: AC = Academy, CTC = City Technology College, CY = Community School, CYS = Community Special School, FD = Foundation School, FDS = Foundation Special School, IND = Registered Independent School, INDSS = Independent Special School, NMSS = Non-Maintained Special School, VA = Voluntary Aided School, VC = Voluntary Controlled School (if you are as baffled by these as I was, see here or here).

Impact of Incl./ Excl. GCSE Equiv. on School Ranking


The plot shows nine regions of England. Each point represents a school in that region and is coloured by its type. On the x-axis is the inverse (higher = better) regional ranking of the school based on its GCSE scores only, and on the y-axis is the regional ranking if “equivalents” are included. If the inclusion/exclusion of equivalents made no difference to the rankings then the points would follow the grey lines perfectly. In reality we get schools falling on either side of this line, with those under it benefiting if equivalents are counted and those above benefiting if they are excluded. For example, broadly speaking, independent schools (light blue) look worse when GCSE equivalents are used in the ranking criteria and therefore would benefit if such qualifications were excluded. This also seems to be the case for the voluntary controlled schools in pink. Academy schools (orange), however, do much better with the inclusion of equivalent qualifications, perhaps reflecting a more vocational emphasis in their curriculum. There are also some interesting regional distinctions, with independent schools, for example, in the South West and South East appearing to do well whatever the ranking criteria, whilst the East/West Midlands and the North East present a more mixed picture. I think a lot more can be said about these plots so I would welcome comments!
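For anyone wanting to reproduce this kind of plot, a rough sketch follows. It assumes a school-level table with hypothetical columns (region, school_type, score_gcse_only, score_incl_equiv); the real figures come from the DfE performance tables, and the column names here are purely illustrative.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical school-level table; column names are assumptions for illustration.
schools = pd.read_csv("school_performance_0910.csv")

# Rank schools within each region, once on GCSE-only scores and once with
# equivalents included. Higher score -> higher rank number (higher = better).
schools["rank_gcse_only"] = schools.groupby("region")["score_gcse_only"].rank()
schools["rank_incl_equiv"] = schools.groupby("region")["score_incl_equiv"].rank()

regions = schools["region"].unique()
fig, axes = plt.subplots(3, 3, figsize=(12, 12))

for ax, region in zip(axes.flat, regions):
    sub = schools[schools["region"] == region]
    # One colour per school type (AC, CY, IND, ...).
    for school_type, grp in sub.groupby("school_type"):
        ax.scatter(grp["rank_gcse_only"], grp["rank_incl_equiv"],
                   s=8, label=school_type)
    # Reference line: points on it are unaffected by including equivalents.
    lim = max(sub["rank_gcse_only"].max(), sub["rank_incl_equiv"].max())
    ax.plot([0, lim], [0, lim], color="grey", linewidth=0.8)
    ax.set_title(region, fontsize=9)

axes.flat[0].legend(fontsize=6)
fig.supxlabel("Regional rank, GCSEs only")
fig.supylabel("Regional rank, incl. equivalents")
plt.tight_layout()
plt.savefig("ranking_comparison.png", dpi=150)
```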