
Scotsnet

Welcome to Scotsnet - discuss all aspects of life in Scotland, including relocating, schools and local areas.

Time for competition for the SQA?

16 replies

SockYarn · 11/08/2020 20:05

Part of the reason for this shitshow is that there is no competition over who sets exams. All state schools in Scotland, and the private ones which do Highers (most of them) must use the SQA. It's been that way since at least the 80s when I was doing my exams, probably earlier. No accountability. No incentive to do things better as you have guaranteed "customers" each year.

So why don't the Scottish Government open things up a bit? Either allow someone else to set up, for example, The Caledonian Exam Board? Same curriculum, different exams. Schools opt for one or the other. SQA continues to do a shit job, they lose customers. And money. And Swinney's brother and Nicola's school mate are out on their ears.

Or it doesn't even have to be that complicated. They allow AQA, or EdExcel, or whatever other exam boards are already up and running in other parts of the UK to start setting exams in Scotland too.

When you have one organisation operating a monopoly there is NO incentive to improve. Shake things up a bit. Give headteachers the power to decide what's best for their schools and students.

OP posts:
Lidlfix · 11/08/2020 20:21

Interesting thought. There are companies such as P&N and Perfect Papers which provide prelims.

SockYarn · 11/08/2020 20:35

The "down south" exam boards have the rest of the infrastructure too - ways to recruit markers, quality control, printing certificates.

Won't happen though.

OP posts:
Scotslassie1 · 11/08/2020 20:47

But the same statistical moderation programme was used in England and Wales? I think we'd need to look further afield.

SockYarn · 11/08/2020 21:00

Not just the exam fiasco this year - everyone has a horror story to tell about the SQA. They are so shit because they have no incentive to be better.

OP posts:
wigglybeezer · 12/08/2020 11:21

I think it's too small a market to get anyone excited about taking it on, do you not think? You'd just get a lot of former SQA people taking redundancy and setting something up... out of the frying pan into the fire.
What do they have in Ireland, for instance? I don't think they have private companies involved.
Certainly don't want anything that would lead to A-levels...
I don't think the SQA are the main ones at fault here; in a data-driven environment they protected the data. It's the politicians who should have asked for possible outcomes and had the bravery to nix the whole thing earlier when they realised what effects would likely occur. Not that I think the SQA are marvellous.

Dissimilitude · 12/08/2020 11:29

The problem here wasn't the SQA. The moderation process was entirely sensible!

www.bbc.co.uk/news/uk-scotland-53580888

  • teachers had to decide grades based on past work, prelims etc.
  • teachers then had to rank pupils within each subject.
  • SQA then calculated maximum and minimum pass rates based on, yes, historical standards at the schools via a 4-year average, with adjustments to account for differences in variation between subjects (e.g. English is easier to predict because there's huge data on previous sittings; Gaelic less so, since hardly anyone takes it).
  • where teacher estimates were significantly outside historical norms, intra-pupil rankings were used to adjust grades to fit them back within bands.
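The steps above can be sketched in code. To be clear, this is a deliberately crude, hypothetical simplification (pass/fail only, a single tolerance band); the 5% tolerance, the function names and the cap-then-downgrade rule are my own assumptions, not the SQA's published algorithm:

```python
def moderate(estimates, ranking, hist_pass_rates, tolerance=0.05):
    """Cap a class's pass rate near its 4-year historical average.

    estimates       -- dict pupil -> teacher estimate ("pass" or "fail")
    ranking         -- pupils in teacher's order, strongest first
    hist_pass_rates -- pass rates for the previous 4 years, e.g. [0.55, ...]
    tolerance       -- assumed margin allowed above the historical mean
    """
    hist_mean = sum(hist_pass_rates) / len(hist_pass_rates)
    max_passes = int(len(ranking) * (hist_mean + tolerance))

    moderated = dict(estimates)
    passing = [p for p in ranking if moderated[p] == "pass"]
    # Where estimates exceed the band, the lowest-ranked passing pupils
    # are the ones adjusted down -- the intra-pupil ranking does the work.
    for pupil in passing[max_passes:]:
        moderated[pupil] = "fail"
    return moderated
```

So the pupils who lose out are exactly those their own teacher ranked weakest, in classes whose estimates sat outside historical norms.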

Crucially, now that we have jettisoned this final step in the name of political expediency, and relied entirely on teacher judgement, there is zero calibration between schools.

Meaning the burden of unfairness now sits squarely on those pupils whose teachers didn't tear the arse out of their estimates.

Dissimilitude · 12/08/2020 11:40

In other words, if you were adjusted down in a subject, it was because your teacher submitted grades which were wildly outside of historical norms AND the teacher ranked you as one of the weaker pupils in a given subject in the class.

Now we have a set of results which are 20% better than any set of results, ever, in a year where there was less teaching 😂

WaxOnFeckOff · 12/08/2020 12:11

But that whole plan is inherently flawed, as there was no cross-checking to see if anything that appeared out of the norm was due to overestimating or not, no QA, no data analysis on individual students. It's all very well to say that on a national level the results are correct, but this takes no account of actual individuals. It doesn't account for work having been undertaken in the school on various subjects, doesn't account for statistical variability on small sample numbers etc etc etc.

It was shit and if it was down to teachers, a bit of actual checking could have identified if that was the case.

Poor poor management, poor criteria and shite outcome that could have been avoided with some proper work.

Work for the SQA do you @Dissimilitude?

SockYarn · 12/08/2020 12:17

This isn't just about this year's exams though. In general, when you have one company controlling a market with a complete monopoly, there is no incentive to improve, innovate and get better at what you do. Changing that situation might result in better service for all schools, teachers and students.

OP posts:
WaxOnFeckOff · 12/08/2020 12:26

That perfectly describes the SNP too.

Can't believe people are still defending all this just because they said sorry when forced to. It was the fear of losing the votes from those they have so carefully tried to brainwash since they started school that was the tipping point, nothing to do with being "fair".

NS said to judge her on education so the buck stops there.

Dissimilitude · 12/08/2020 12:51

@WaxOnFeckOff

But that whole plan is inherently flawed as there was no cross checking to see if anything that appeared out of the norm was due to overestimating or not, no QA no data analysis on individual students. It's all good to say that on a national level the results are correct, but this takes no account of actual individuals. It doesn't account for work having been undertaken in the school on various subjects, doesn't account for statistical variability on small sample numbers etc etc etc.

It was shit and if it was down to teachers, a bit of actual checking could have identified if that was the case.

Poor poor management, poor criteria and shite outcome that could have been avoided with some proper work.

Work for the SQA do you @Dissimilitude?

Obviously an integral component of the moderation system would have been sample testing to help audit the quality of teacher submissions, and a very robust and quick process to validate estimates that were completely out of line.

I am assuming that at least some kind of audit-driven validation of estimates was in place, and it wasn't a purely statistical exercise. Pretty scandalous if not, I agree.

I would like to see exactly what the process was here, because the devil is in the detail.

Either way, I think the SNP have been utterly shocking on this. What can't be disputed is that the current situation is a fiasco, and that by correcting the injustice of one group, they have applied an injustice to another, less vocal group (the set of people whose teachers estimated modestly, who are now relatively worse off in rankings). The credibility of this year's results is also in tatters.

Dissimilitude · 12/08/2020 12:55

I also think that it is simply obvious, statistically, that rampant over-estimation has occurred, even if we have to throw our hands in the air and say that because we can't tell exactly who did and didn't do it, we have to let it all go.

Which may be the only solution, given where we are, but they've sacrificed the integrity of the system.

Dissimilitude · 12/08/2020 13:00

"doesn't account for statistical variability on small sample numbers etc etc etc."

Sorry for repeated posts, but they actually did account for this, I believe. Wider margins of error were permitted on subjects with smaller numbers of pupils, to account for the greater variability of smaller data sets.
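The statistics back this up: the standard error of a pass rate shrinks with the square root of cohort size, so small classes genuinely need wider bands. A quick sketch (the 75% pass rate and the cohort sizes are made-up numbers, purely for illustration):

```python
import math

def pass_rate_standard_error(p, n):
    """Standard error of an observed pass rate p over a cohort of n pupils."""
    return math.sqrt(p * (1 - p) / n)

# A 10-pupil Gaelic class vs a 200-pupil English cohort, both with a
# true 75% pass rate: the small class varies ~4.5x more year to year.
small = pass_rate_standard_error(0.75, 10)    # ~0.137
large = pass_rate_standard_error(0.75, 200)   # ~0.031
```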

WaxOnFeckOff · 12/08/2020 13:02

I understand that no sampling of evidence was undertaken; it isn't detailed in the information given about the process, so it was never an expectation either.

I've posted on another thread, but a large part of the difference in figures is unlikely to be due to mass overestimating. If you take away the stress of the exam and all the factors that could arise on the day, then you already have a statistically significant change. Then add on all the borderline pupils that would be given the benefit of the doubt, those that perform at say 48/49/50% for an exam with 50 as a pass mark. A number of those are going to miss by one mark in the actual exam, but it would be pretty hard-hearted for a teacher to say that they only achieved 50 a quarter of the time so I'll mark them as a fail.

So, not wildly giving solid C students an A. I've no doubt there will be specific incidents where that might be the case, but proper data analysis and cross-checking would have gone a long way to identifying those cases.
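The borderline effect is easy to demonstrate with a toy simulation (all figures assumed, nothing to do with real SQA data):

```python
import random

random.seed(1)
N = 1000  # hypothetical cohort of pupils all sitting right on the boundary

# On exam day, each borderline pupil passes with roughly 50% probability.
exam_passes = sum(random.random() < 0.5 for _ in range(N))
exam_rate = exam_passes / N        # close to 0.5

# With teacher estimates, every borderline pupil gets the benefit of
# the doubt, so they are all estimated as a pass.
teacher_rate = 1.0

gap = teacher_rate - exam_rate     # roughly half the cohort "inflated"
```

No individual teacher here is doing anything wild, yet the cohort-level pass rate still jumps well clear of any exam year.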

Dissimilitude · 12/08/2020 13:22

If no sampling was done, then that's genuinely terrible (and stupid), and blame sits on the SQA, as well as the government for tolerating it.

"but a large part of the difference in figures is unlikely to be due to mass overestimating. If you take away the stress of the exam and all the factors that could arise on the day, then you already have a statistical significant change."

I think that's very, very debatable. Plenty of students actually perform better in exams than they do in coursework (there's a well known gender gap in coursework vs exam attainment, for example). So I don't think it's a given.

But I take your point on the sampling. I await more detail here, because I find that inexcusable.

WaxOnFeckOff · 12/08/2020 13:29

I think that's very, very debatable. Plenty of students actually perform better in exams than they do in coursework (there's a well known gender gap in coursework vs exam attainment, for example). So I don't think it's a given.

Yes, they do, and there are a number of grade increases based on what they did, but teachers weren't only marking based on coursework; they were using performance in prelims and class tests, and previous performance in Nat 5 for those doing Higher, so much of that would be included at an appropriate grade. My DC are those that pull it out in an exam as compared to coursework, so that wouldn't have suited them at all, but teachers don't have crystal balls to be able to mark people up with no evidence.

I think the largest part of the discrepancy will be those borderline cases to be honest.

And I have yet to see any claim that any sort of actual moderation and looking at evidence was done. I therefore conclude that none was done or they'd be shouting about it.
