league table lies

coldrain2018 · 13/09/2018 22:54

Out of frustration at reading posts from people who believe league tables are a genuine indicator of the quality of a school, I'm starting a list of the ways they lie.

I have many examples, but it's late and I'm tired, so I'll only do a few tonight and will come back tomorrow.

If anyone wants to add their own lies, please feel free.

coldrain2018 · 13/09/2018 22:57

The single biggest lie is the whole idea that the statistical analysis done on school results is valid in the first place. It isn't. If you attempted to publish such results in a peer-reviewed science journal, you would be rejected for lack of robust statistical analysis. The conclusions drawn from the statistics used on schools would not be accepted as conclusions that could legitimately be drawn from such statistics in science.

gallicgirl · 13/09/2018 22:59

Surely the conclusions are in the eye of the beholder?

coldrain2018 · 13/09/2018 23:07

Another is the choice of which exams students are entered for. This has nothing to do with which exams suit the student; decisions are made by looking at the student as a whole and deciding how that student can best reflect the school in their results.

You therefore get things like children who are going to get a 1 or 2 in their foreign language not being allowed to give it up to concentrate on other subjects, because it's better for the school that they enter a language and score anything at all than get no language but score better in their other subjects. It's not better for the child, it's better for the school.

Then we get children who are entered a year before they should be: late arrivals in the country, non-English speakers, who will be entering the GCSE resit year in sixth form and taking GCSEs a year late anyway.

You would think such students would benefit from being placed in year 10 rather than year 11, wouldn't you? That way they can start the GCSE courses from the beginning, spend the year learning English and learning GCSE subjects, then move straight to the GCSE resit year post-16 and do the year 11 work there.

WRONG! Of course it is best for the students, but not for the school! Having 16-year-olds not sit GCSE exams means they count as 0s in the school stats. So stick them straight into GCSEs, hope they get a couple of 1s or 2s, don't worry that they don't have a clue what is going on (there is always a chance they could hit some right answers in multiple choice), then throw them into the sixth-form resit year to do the year 11 work there, without ever having touched the year 10 work.

Some schools have 10-20 students in this category. Decent schools ignore the hit they take on the league tables and teach these children with year 10. Most schools stick them into year 11 to sink.

noblegiraffe · 13/09/2018 23:07

Progress 8. The lie that a school with a progress 8 score of -0.1 is worse than a school with a score of 0.1.

When you take confidence intervals into account, you can’t actually tell the difference between them with any reliability. The first school could actually be better.
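
To make the overlap point concrete, here is a minimal sketch of the kind of calculation involved. It is not the DfE's exact published method; the ±1.96·σ/√n form, the pupil-level standard deviation of 1.2 and the cohort size of 180 are all illustrative assumptions.

```python
# Rough sketch only: assumed pupil-level SD (1.2) and cohort size (180).
import math

def p8_interval(score, cohort_size, pupil_sd=1.2, z=1.96):
    """Approximate 95% confidence interval for a school-level Progress 8 score."""
    margin = z * pupil_sd / math.sqrt(cohort_size)
    return (score - margin, score + margin)

school_a = p8_interval(-0.1, cohort_size=180)  # roughly (-0.28, 0.08)
school_b = p8_interval(+0.1, cohort_size=180)  # roughly (-0.08, 0.28)

# The two intervals overlap, so the published scores cannot reliably
# rank the two schools against each other.
print(school_a, school_b)
```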

noblegiraffe · 13/09/2018 23:13

Off-rolling. Loads of kids who were destined to do badly disappear from schools and their league tables.

www.tes.com/news/sharp-rise-missing-pupils-fuels-rolling-fears

FanDabbyFloozy · 14/09/2018 10:22

The "external candidate" phenomenon - there was a mum posting here earlier in the year whose independent school had registered her DD to sit GCSEs as an external pupil, so as not to muck up their results. (The child had had some health problems IIRC).

The "can't pick your A Levels" phenomenon - top schools who don't allow children to pick their own A Levels, instead "guiding" them to the subject they will get top marks in. A woman told me her son had been guided away from a subject, despite getting A* at GCSE.

The "cull" - fewer places at A Levels so as to weed out weaker pupils.

The Twitter game.. Schools that tweet that their top pupils for 9x 9s, 3 X 8s etc. and then list 4 pupils who got these between them.

I have much more respect for the schools that don't play these games.

coldrain2018 · 14/09/2018 10:27

The ruthless selection at sixth form makes me sick

children with health problems - out

children with SEN - out

children with unstable family situations which might mean they end up moving while on the course - out

children who had an unlucky day in the GCSEs but should be given the benefit of the doubt - out

children not making their targets in the first year - out

This is how comprehensive-intake schools shine in their sixth-form stats: they are so excessively selective. I know this because we are a "sink" school that offers anyone a sixth-form place and takes students who have been rejected from all sorts of other places for all sorts of reasons.

coldrain2018 · 14/09/2018 10:30

Cooking the targets is another good one. I've been in schools which love an intake year with a lot of students who don't have records... you can well and truly mess up their targets, setting them so low that the school can exceed them by several grades. It does wonders for the league table to chuck a few like that into the statistics.

Of course there are other ways of ensuring your intake has lower-than-realistic target grades too.

coldrain2018 · 14/09/2018 10:33

Entering weaker students for courses which are coursework heavy, then doing the coursework for them.

coldrain2018 · 14/09/2018 10:34

This is less common now, but plenty of league table results over the past 20 years have depended on it.

HarveySchlumpfenburger · 14/09/2018 10:38

The odd thing in the KS2 progress score where children with a very low APS at KS1 are treated as a homogeneous group that will all have the same trajectory. In fact, they're mainly split between children with SEN and children in the very early stages of speaking English. Consequently the children with SEN end up with huge negative progress scores and the children with EAL end up with huge positive progress scores.

coldrain2018 · 14/09/2018 10:40

Putting students in for pointless easy exams which count for nothing other than scores in the league tables. ALAN tests were always my particular bugbear - the standard was HALF that of a GCSE, but if you did one in English and one in maths and got half a GCSE in each, that counted as having a GCSE in English and maths!!! Thankfully, no more, but other similar cons spring up all the time.

coldrain2018 · 14/09/2018 10:44

English literature..... Honestly, the number of poor kids I have seen struggle with this when academically it is way beyond them, useless to them, and their time would be far better spent concentrating on other exams.

Because the grade boundaries are lower, and because it is easier to fluke a 4 than in English language, forcing unsuitable candidates through it more than doubles the chance they will count as having "passed an English exam" in the school league tables, even if they are functionally illiterate.
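
A minimal sketch of that "more than doubles" reasoning, using made-up pass probabilities and treating the two exams as roughly independent attempts; both assumptions are purely illustrative.

```python
# Hypothetical probabilities for a weak candidate; not real grade statistics.
p_pass_language = 0.15    # assumed chance of a grade 4 in English language
p_pass_literature = 0.20  # assumed, slightly higher, chance of fluking a 4 in literature

# Entered for language only: the only route to an "English pass" in the tables.
p_language_only = p_pass_language  # 0.15

# Entered for both, treated as roughly independent attempts for illustration:
p_either = 1 - (1 - p_pass_language) * (1 - p_pass_literature)
print(round(p_either, 2))  # 0.32 - more than double 0.15
```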

FanDabbyFloozy · 14/09/2018 11:11

Independent schools that take iGCSEs instead of GCSEs, although that particular ruse seems to have had its day.

Although a very unpopular view on MN, I think the move away from coursework for academic subjects has lessened the opportunity for such games.

RedSkyLastNight · 14/09/2018 12:52

Restricting the number of students that can take perceived "hard" GCSEs.

A school near us only allows the top 25% of the year to take triple science.
At my DC's (lower ability intake) school it's close to 50%.

roguedad · 14/09/2018 21:01

noblegiraffe already said this but giving uncertainty intervals that reflect the size of the cohort and hence the noise in the estimate is really important. There was some research on this in a medical context a while back that made a nonsense of medical league tables. (Spiegelhalter at Cambridge I think).

The bias in the cohort is also important. There are similar issues with schools taking more weak kids, just as surgeons willing to take on tough cases can look like they have poorer mortality rates.

One stat that really annoys me is the % of kids getting 5 A*-C including maths and English (or its translation to the new numerical grades). This is hideously sensitive to the degree of overlap between the set of kids who are weak in English and the set who are weak in maths. I've seen schools claim improvements in this stat when in fact they've just had a cohort where more of the weak kids were weak in both subjects rather than just one, while keeping similar pass rates in maths and English. For example, you could have 60% pass rates in both subjects, but that combined figure could in theory range from 20% (kids weak in different subjects) to 60% (kids weak in the same subjects). In practice it doesn't go to quite that extreme, but you get random fluctuations in the overlap that are easily misrepresented as improvement.
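
Here is a minimal worked version of that 20%-60% range, using the same illustrative 60% pass rates and an assumed cohort of 100 pupils.

```python
cohort = 100
pass_maths = 60    # 60% pass rate in maths
pass_english = 60  # 60% pass rate in English

# Least overlap: the 40 pupils failing maths and the 40 failing English are
# different pupils wherever possible, so only 20 pass both subjects.
min_both = max(0, pass_maths + pass_english - cohort)  # 20

# Most overlap: the same 40 pupils fail both subjects, so 60 pass both.
max_both = min(pass_maths, pass_english)  # 60

print(f"Pupils passing both maths and English: {min_both} to {max_both} out of {cohort}")
```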

TeenTimesTwo · 15/09/2018 09:47

So OP, what is your solution? Go back to parents having no information on exam success at all? Where is the accountability for schools, or the openness so parents can make choices?

A simple example - the y1 phonics screening test.

The government had to mandate this due to the hundreds of schools who, despite evidence, were still not teaching phonics properly, so children were being failed by not being taught how to decode unfamiliar words. The screening test showed up those failing schools so they had to improve their methods.

Yes, statistics aren't the be-all and end-all in any way, but they form part of a picture. Yes, there are ways of massaging the results (you get what you measure), but the way to deal with that is surely not just to say 'don't bother'?

(Though more understanding of statistics would definitely help the general population).

noblegiraffe · 15/09/2018 09:57

Teen, have you seen the graph of data for the phonics check? Does it highlight failing schools, or those who have administered the check correctly?

Getting schools to administer a test that they are then judged on is obviously flawed.

In fact the phonics check is an excellent example of how this sort of data collection is junk.

schoolsweek.co.uk/phonics-check-needs-rethink-after-data-shows-something-dodgy/

TeenTimesTwo · 15/09/2018 10:09

noble So is the fact that teachers cheat on a test (and people wonder why teachers sometimes aren't as respected as they should be!) a reason to not even attempt to measure?

I agree the jump just at the pass point is highly dodgy (and highly depressing). However, don't you think that by at least measuring, you make parents more aware of the importance of phonics, so they hopefully ask questions about it when choosing primary schools?

Or am I hopelessly optimistic?

TeenTimesTwo · 15/09/2018 10:19

I guess, what I'm trying to say is that I use statistics to inform my questions of the school, not to make snap judgements.

However, maybe too many people take them at face value and unquestioningly think that 86% is better than 84%, and that's where it all goes wrong.

noblegiraffe · 15/09/2018 10:21

Teen If you have a high-stakes test whose results are published, then it will distort the education around that test, like a giant turd on a trampoline.

If the DfE want to check whether phonics is being implemented in schools, there are ways to do it that don't involve teachers who are in fear for their jobs/pay assessing their own pupils.

League tables are another turd. The DfE has discussed scrapping them, but the conclusion was that other institutions (e.g. newspapers) would gather the data and publish their own tables anyway. By publishing the tables itself, at least the DfE has control over what is in them.

Progress 8 is crap though. Other options were put forward for presenting the progress of schools to parents (e.g. graphs of progress for centiles of pupils), but it was thought that parents wouldn't understand them. So something statistically dubious was used instead. Hmm

TeenTimesTwo · 15/09/2018 10:29

How would you check phonics teaching then?

My understanding is that when they were brought in, a lot of teachers said they were unnecessary because they were good teachers who were doing this kind of thing anyway. Then the tests were done and we started hearing about how good readers fail the test because they have gone 'beyond phonics'. Hmm

Anyway, sorry. Without data, how do we know whether your school's maths teaching is noticeably better or worse than the school across town?

noblegiraffe · 15/09/2018 10:39

Depends on whether you want to check individual school compliance or national compliance. Some sort of random sampling? External assessors? Part of Ofsted? It would cost more money than getting teachers to do it, and saving money is generally considered more important than accurate data.

Without data, how do we know whether your school's maths teaching is noticeably better or worse than the school across town?

Do you know that with the data? Maybe my school's kids hire tutors on a grand scale and the school across town has a poorer intake who can't afford to. We know socio-economic status and attainment are strongly linked. How can you extract the effect of the teaching from the data as presented?

TeenTimesTwo · 15/09/2018 10:49

But isn't the data a starting point?

Say you both get 75% grade 4+ at GCSE.
But I know your school is in the leafier area. That would make me question why your results weren't 'better' than the other school's.

You can't use the stats to say a school with lower results is de facto less good. But without data aren't we going back to 'trust us, we're wonderful'?

TeenTimesTwo · 15/09/2018 10:55

At work, we had something called a 'business balanced scorecard' for development projects: a set of information you had to look at together. Was the project to schedule? To time? What was the level of risk? And some other things I forget. The point was you couldn't just look at one aspect; you had to judge how things were going across everything.

Isn't that like the data we now get? Previous high/middle/low achievers, FSM/PP pupils, results in various things, comparisons with similar schools, etc.?
