
Check the data for your child's school - find the hidden truth

(57 Posts)
riddlesgalore Wed 27-Feb-13 17:48:35

You might like to ask questions based on DFE performance data:

Particularly the average point scores and the percentage of GCSE-only entries against all entries (including equivalents).

Also Ofsted's Data Dashboard:

You may find it interesting to compare schools against the benchmarks you think important. Also remember to check the courses entered at GCSE and A level to ascertain whether the academy/school is guilty of any league table manipulation at the expense of pupils attaining their full potential.

I think you will find it very interesting.

tiredaftertwo Thu 28-Feb-13 18:20:07

And it specifically recommends governors get access to the RAISEonline data, and explains that the tables are based on the league table data set and driven by it. And that they are simple, accessible tables for everyone to use.

How often would you expect data about annual exam performance to be updated BBB? I don't see how it could be more than annually.

BigBoobiedBertha Thu 28-Feb-13 19:19:52

It can't be more than annually, obviously, but having just read how Ofsted expect governors to know what it says, and for this to be evidence that they are doing their job properly and monitoring the performance of the school - that's just silly. And if it doesn't work to inform governors about how a school is changing, it won't work for parents either. Perhaps you don't think parents need the information, but that is far more important to me than a set of test results. I am not interested in what results last year's cohort got, unless my child was one of them. I am not even that interested in what similar schools did. I want to know that a school is adding value and how much value it has added. That seems to me a far more significant statistic. The Dashboard doesn't have much, if anything, to add to what the league tables already say on that.

You know how teachers always complain about some parents asking how the rest of the class are doing when they should just focus on how their own child is doing. Well, this is the same thing, scaled up. As a parent I am not interested in what other schools are doing. I am interested in how my child's school is doing and how much improvement my child can expect to make there.

thesnootyfox Thu 28-Feb-13 19:41:50

Thanks. The second table is very useful.

tiredaftertwo Thu 28-Feb-13 20:38:24

BBB, the FAQs and other info make clear that:

1. This is not primarily for governors - the FAQs specifically tell them to seek out other, more detailed data. If Ofsted choose to assess on that, that is up to them, presumably as a baseline measure, but yes, most governors will know an awful lot more. Where did you read that?

2. This is a subset of the league table data, in a simpler form, so it adds nothing new. Similar schools provides context - how schools with extremely similar intakes in terms of attainment perform - but if you are not interested in that, that's fine, you don't have to read it.

As a parent and taxpayer, and someone who would like to see educational inequality addressed, I am very interested if schools with similar intakes have wildly differing results.

No-one has to read these. And yes they are very simple and not nuanced.

AChickenCalledKorma Thu 28-Feb-13 21:35:55

"May I suggest you read the dashboard guidance document first. As BBBertha says, it's all in the small print."

That is precisely the point. It's all in the small print. Which most people won't bother to read, because they will look at the "dashboard" and say "oh, that school's cr@p" and not bother to look any further. I fully support parents having access to information, but I don't support it being presented in such a simplistic manner. I think the current league table information is very good, because you can actually look at the information behind the headlines and get an understanding of how schools compare in terms of intake etc. This goes completely the other way and simplifies things to the point of being misleading.

Of course I am speaking specifically from the point of view of a school that is doing all it possibly can to do a great job in a "difficult" area - surrounded by leafy suburbs where the other nearest schools have a much easier time of it. And I'm a bit fed up of the tumbleweed moments when other parents make snap judgements about me, based on completely misconceived ideas about the school my children attend. (Where, incidentally, they are doing very well, happy etc etc etc)

tiredaftertwo Thu 28-Feb-13 21:56:00

I understand how you feel AChickenCalledKorma. So how does your school do on the "similar schools" measure and on progress, which are not measuring raw attainment? I am genuinely interested because I think schools in unusual situations can come badly out of tables like these, but from what you have said it should be doing very well on both those measures? The similar schools measure is specifically about intake - so it is in there (that is on attainment - narrowing the gap is about disadvantage)

I can see the argument for not providing any data - though I don't agree with it. But surely once you start down this route, the more the better, and people will learn to interpret it. No table will be perfect. But the more data, the fewer perverse incentives to go for table points not child's interests, it seems to me.

And actually, when you look at exam results (including against schools with very similar intakes) narrowing the gap, and progress, quite a lot of detail and context is provided for something that is so easy to read. I think.

When I first looked at these earlier I thought OK, pretty simplistic but OK. But actually there is more in them than that when you look at all the reports, and the notes then also point you to the full league tables. I think it's not a bad effort - if they could do better, let's tell them how.

AChickenCalledKorma Thu 28-Feb-13 22:53:46

tiredaftertwo It does do better on the "similar schools" measure - except in maths. And the thing is, I know that there is an issue with progress in maths. But the problem is focused on a specific group of children - the majority of pupils are doing absolutely fine.

BigBoobiedBertha Thu 28-Feb-13 22:53:57

tiredaftertwo - I read it on the Ofsted website {[ - this here]] Apparently some governors don't even know the data that is in the dashboard. How you can be a governor without knowing, at the barest minimum, what is contained in the Dashboard, I don't know. It happens though.

You'll have to forgive me a little - I am looking at this as a governor on one thread and as a parent on another. As a parent, I have exactly the same concerns as AChickenCalledKorma about bare statistics, especially oversimplified ones, not giving a true picture of attainment, which the dashboard doesn't. It makes no clear statement about value added, and without that, the rest is just reporting history really.

And trying to compare schools still feels to me to be the equivalent of sneaking a look at another child's book bag to see what reading level they are on, and then being one of those parents who thinks their child should be doing as well as the other child. After all, the children have the same education, the same teachers, so we can compare them, can't we? Only we can't, because they aren't really the same when it comes down to it, so comparing one school with another based on pure statistics is so overly simplistic as to be pointless for anything other than the broadest of overviews. It tells me nothing about how my child could or should be doing in the school he is in or might go to.

BigBoobiedBertha Thu 28-Feb-13 22:56:48

Grrr - bracket malfunction blush

This is the real link - this here

montmartre Thu 28-Feb-13 23:25:51

FFS- what is this thread? The infiltration of MN by DFE wonks? hmm

None of the information available in the dashboard is newly available, and they are a very basic representation of summary data.

As a parent, what I am interested in for (secondary schools) are:

Academic standards (including value added)
Destinations of former pupils (by type, and institution)
Pastoral support/support for those with additional needs

How are the DfE going to measure (ie quantify) the quality of pastoral support?
When will schools publish destinations for all pupils? (Independent schools do!)
How do you explain that Pate's Grammar, which got 100% A*-C in Science, is classified in the 3rd quintile for similar schools? hmm

And wow- 'In Your Area' - schools information for 2010, how informative. 60% of schools in my area have gone academy since then, so really up-to-date there... AND the KS4 performance data links on there are broken! <<rant over>>

tiredaftertwo Thu 28-Feb-13 23:29:41

BBB, thanks, I agree, this is the minimum, and that is what the link implies too. Wilshaw is saying governors should know their schools well, and for those that don't there is now no excuse. It does not say this is a tool for performance management on its own. It says it is a snapshot. I too am appalled if there are governors who don't know where else to get this data - and more. But if there are, then perhaps it is better provided in this way, with the recommendation to get into RAISEonline very clearly stated in the FAQs. If governors don't find it useful, as I hope they won't because they have access to more detailed and up-to-date info, then fine, they don't have to use it.

I don't understand your other points. The dashboard includes data on progress, and contextual data - not masses, but it is there and it is interesting. Can you point to what you mean? I agree they tell you nothing about how your child is doing - they are not designed to. If you are happy with the information you have, then don't read them. They say they are a snapshot and refer you to more detailed information sources. I don't get the problem, except that they are treating parents as intelligent adults who can be trusted.

I can look up schools in my area, very close geographically and with similar intakes and within a few minutes see that they have very different results. So can journalists, researchers, analysts, all of us.

AChickenCalledKorma - that sounds like it gives a reasonably accurate picture then? If the result compared with similar schools is significantly lower in maths, then the group of pupils affected cannot be insignificant? And the results in other subjects are good compared with similar schools, so that's good, and accurate too.

tiredaftertwo Thu 28-Feb-13 23:46:10

montmartre, yep, a basic summary of already available data. That's what they say they are - a simple snapshot - and it is what they seem to be. What I don't understand is why this makes people angry. They are not perfect but they may be useful for some purposes. And most importantly, these and the league tables do allow us to start to look at where inequality arises and how - does no-one else think this is important? Schools can no longer say their results are poor because of their intake if schools with very similar intakes are doing much, much better. Schools with very able intakes can no longer coast if schools with similarly able kids are doing much, much better. Good. Everyone can have the info, not just those who know whom to ask and how.

Leavers' destinations - I agree, but I think it would have to be on schools' websites, as I can't see how it could be standardised.

I don't see how you can measure pastoral care in this way - that doesn't mean it is not important.

I suspect with Pate's that it is because all those schools will get 100% - perhaps you should ask Ofsted?

montmartre Thu 28-Feb-13 23:50:53

tired - I am not angry about the data! I am cross and cynical about the OP, and their motivation for posting this so fawningly.

montmartre Thu 28-Feb-13 23:54:13

Value-added information has been available for years though- I don't think schools have been able to claim poor results due to intake for a very long time.

It doesn't address the inequalities in the education system though - if anything it magnifies them, as now parents with choice know which schools to avoid!

tiredaftertwo Fri 01-Mar-13 00:14:00

Absolutely, yes it has, but the current league tables include a lot more information than they did at first. Within a couple of miles of my house, there is one school where the high attainers get an A- average and another where they get a C+ average. I'd at least want to ask why if I were considering them - wouldn't you? There may be a good reason - but they can no longer claim they don't have high attainers.

Ah, there's the nub about inequality, isn't it? I didn't say they address it as in solve it, but this sort of data (actually the more detailed league tables) will let us start to identify where and how it arises - because it exists. And then perhaps we can do something to fix it. warwick1 is right about what has been revealed.

Yes, another solution would be to conceal all information so no-one except a tiny elite can make an informed choice, and so hide all failure (I am sure you didn't mean that smile).

OK, other posters have sounded cross about the dashboard.

I know I probably sound a bit madly enthusiastic about all this, but think we will be getting more and more information thrown at us, because it is now so easy and governments can claim they are transparent then, because they have dumped a load of tables and spreadsheets (often about other bodies) in the public domain. I think that could be quite cynical. But to hold them, and the public bodies whose performance is being captured to account, we will have to grapple with it I think.

BigBoobiedBertha Fri 01-Mar-13 03:22:07

But who are you going to ask all these questions of, tiredaftertwo? What do you want to know all this stuff for? You talk about addressing inequalities as if you have the power to do that. I would suggest that unless you are planning on getting elected into local government, you don't have any power at all. Are you going to go to each school and ask the same questions? Do you think you will get a decent (or honest, come to that) answer from them? Do you not think you will get a sanitised version of what the results mean that fits the picture they want to create? I really can't see that throwing data into the public arena serves any purpose other than to tie people up in knots until they throw their hands up in despair and simply don't bother to look at it at all, or perhaps worse, to convince them they know more than they do. It's all smoke and mirrors. You can take whatever you want from the stats, but I guarantee there will be somebody somewhere who, looking at the same data as you, will come away with the completely opposite idea.

Really, I don't see what the Dashboard adds to what we already have; it is just one big money-wasting exercise to see how many ways we can produce the same data and convince people we are getting something new. We aren't. I'm not angry about it, just deeply cynical, and also slightly surprised that the OP seems to wax so lyrical about it. Makes me wonder if she is one of the Government's minions.

Also, I still can't see what benefit there is in comparing schools like this. The dashboards for the 2 schools I have looked at (my DSs' schools) don't tell you who they are being compared with. I hope you realise that schools that are neighbours aren't necessarily those that any given school is being compared to. 'Similar schools' isn't just about location. DS2's school, for example, is compared to 1 of its neighbours but not the other 2. It is also compared to a school on the other side of town in a village, whilst we are in a suburb. On the face of it, very different schools. In your example of 2 schools with supposedly high attainers getting different levels of attainment, it is quite possible they aren't in the same comparison group at all. (I don't get your point there, by the way. Of course the school with the lower attainment can claim it doesn't have high attainers. It doesn't have them, does it, if they are only averaging C+? confused)

You can analyse the data until the cows come home, but you will have seen enough stories on MN, if you care to look, of parents who get their child into a school which looks good on paper but which isn't actually what they hoped for or wanted at all, and of others who have sent their child to a school with a poor set of stats and find their children are thriving. For parents, which is who we are talking about, isn't it - not political activists or educational analysts - there is no substitute for going into schools, talking to people and getting a feel for the place. Sifting through data is of limited use, and league tables aren't the be-all and end-all of choosing a school.

BoundandRebound Fri 01-Mar-13 06:39:26

I think those who are interested in educational data should be considering and feeding back on the government consultation paper on how this is all going to develop.

In my opinion a large grain of salt is required when considering league tables, which are, by their very nature, deeply flawed - although we appear to be moving in the right direction with this. Value added, for example, incorporates the EBacc as part of its calculation, and the EBacc itself is unhelpful at best.

I particularly like the Best 8 and the move towards breadth and depth and putting the children rather than the league tables first.

BoundandRebound Fri 01-Mar-13 06:40:14

This consultation was somewhat hidden, as it was published with little fanfare on the same day that Gove did his huge backdown.

tiredaftertwo Fri 01-Mar-13 07:34:12


Your first point - well I suppose I am interested in policy development and do have a lingering belief that democratic processes occasionally improve situations. I am politically active, I do respond to consultations and I follow and belong to various groups that think about this stuff. Horses for courses, fine if you don't agree or are not interested. I think information is power. And it makes me furious when people say we have to pay for and entrust our children to services but cannot be allowed to see basic outcome measures.

Yes, stats can be misinterpreted. No, you "cannot take what you want from them". That just means you haven't understood them. All information can be misunderstood. But it is better than rumour.

Why do you say it is a big money-wasting exercise? The whole point is that this is technology-driven. The govt collects all this data anyway, and it is easily disseminated in different forms. There are endless ways to display the same data set. This is a new way (for the same data). Try downloading a data set and using Excel to draw some simple graphs - it takes a few minutes. Even websites are not very expensive nowadays.
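Since the thread keeps coming back to "just download the data and look", here is a minimal sketch of that sort of quick comparison in plain Python rather than Excel. The school names and percentages are invented for illustration; in practice you would load a real DfE performance-table download (CSV) instead of typing figures in.

```python
# Hypothetical figures for three made-up schools -- not real data.
schools = [
    {"school": "School A", "pct_5_a_star_to_c": 72, "pct_expected_progress_en": 80},
    {"school": "School B", "pct_5_a_star_to_c": 48, "pct_expected_progress_en": 55},
    {"school": "School C", "pct_5_a_star_to_c": 61, "pct_expected_progress_en": 68},
]

# Rank on the headline measure (% achieving 5+ A*-C), highest first.
ranked = sorted(schools, key=lambda s: s["pct_5_a_star_to_c"], reverse=True)

for s in ranked:
    print(f'{s["school"]}: {s["pct_5_a_star_to_c"]}% 5+ A*-C, '
          f'{s["pct_expected_progress_en"]}% expected progress in English')
```

Of course, ranking on a single headline figure is exactly the oversimplification other posters warn about, so you would want progress and intake columns alongside it before drawing any conclusions.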

The FAQs explain very clearly that the comparison for similar schools is with schools with very similar attainment levels on entry. Of course it isn't geographic. But as I have explained it does show up where schools are showing good results but actually not doing well by their pupils. Parents and others can make geographical comparisons if they wish.

Read my last post. The school I mentioned is averaging C+ for its group of high attainers, so of course it has some. And at first glance they are not doing well.

If you are choosing a school there is no substitute for looking at it. And of course there are many other things than exam performance that will determine your child's happiness. No one is saying otherwise. Most of this thread has been people arguing against points that have not been made.

Nope, we are not just talking about parents. Or I am not. The whole point is that there is now all this data (if you wanted to do a local story say, you might use the dashboard for some bottom lines for speed and then use the league tables to go into more depth) and it will be subject to all kinds of analyses from different groups. Hence my reference earlier to the BBC and other media stories about the RG subjects. But I am a parent and I am interested in this - I won't be alone.

If I were choosing which school to put top of my form, and in one 80% made expected progress and in the other 20%, then yes that would be something I would look into and perhaps ask the school about, and consider alongside everything else.

I think I've bored everyone enough! To me, if they are useful to you, fine; if not, that's fine too. They do not seem to me to be inherently misleading, and no-one has produced a real example of where they are (I understand people don't want to identify where they live). Other criticisms accuse them of not being the be-all and end-all, not being new, not being nuanced - none of which they claim to be, and they point you to further info. So I will stop answering now.

Bound and Rebound - thanks for that but the link doesn't work. I totally agree with you about the best 8 (that is an incentive in the right direction - fewer good grades!) and putting children first. I think the more data the tables show, the less chance there is of schools being under pressure to find some obscure way to game the system at children's expense.

Can we all agree that seeing Gove back down was one of the most enjoyable political moments recently?

BigBoobiedBertha Fri 01-Mar-13 11:10:46

I still have no idea what you mean about the high attainers averaging C+. I have read and read your post but it makes no sense to me. Maybe my fault but what are you trying to say?

The dashboard to me is far too basic and just a repetition of what is produced elsewhere; that is why I shall be ignoring it, not because I can't get my little head around it, as you seem to be suggesting. I can get all that information, and more, somewhere else, and have been able to for some time. Knowledge is indeed power, but since this doesn't add anything to the knowledge already available, I fail to see how it is powerful. Producing another way of presenting the data isn't cost-free, whatever you seem to think. Given the fanfare with which this was 'launched', time, effort and money have gone into it (I refer you back to the Ofsted website to see the hoo-ha they are making about the 'launch'). It has probably been somebody's pet project for months, if not years. It is naive to think otherwise. Yes, the data is already being produced by the schools and collated by the LEA, but you only have to look at what schools have to produce now, compared with even a few years ago, to know that this is not without cost, in both time and money.

BigBoobiedBertha Fri 01-Mar-13 11:20:58

You wanted an example of where the data is misleading. Here is an example

I was going to copy it out and palm it off as my own but decided against it. wink

He says it so much better than I can anyway.

Lomaamina Fri 01-Mar-13 11:22:05

Thanks very much for this, riddlesgalore. I looked at my DS's school's dashboard and discovered an interesting [sic] explanation for why they're pushing them on the English GCSE at the moment - it has underperformed in this subject (and not in maths, curiously) in the last couple of years compared with similar schools. I haven't seen such analysis before.

tiredaftertwo Fri 01-Mar-13 12:28:03

Hi BBB, just popping back

Thanks for that link - that is really interesting and makes a good point. Let's hope they change it - the league table measures do change over time. Remember before they included any value added and there was no distinction between types of qualifications and subjects? That was terribly misleading, I think. But the definition of expected progress is not actually about the dashboard - it happens to be one of the data bits they have chosen to include from the larger set.

On the league tables, you can look at the average GCSE grade for different groups of attainers (i.e. grouped by their attainment at entry - sorry, I see the confusion now). In this school, the average grade for the high-attaining group (on entry) was C+ at GCSE.
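For anyone puzzling over what a "C+ average" for a group means in points terms, here is a rough illustration using the old QCA GCSE point scale (A* = 58 down to G = 16, in steps of 6). The cohort grades below are made up, not from any real school.

```python
# Old QCA GCSE point scores, as used in the performance tables.
POINTS = {"A*": 58, "A": 52, "B": 46, "C": 40, "D": 34, "E": 28, "F": 22, "G": 16}

# Hypothetical grades for a school's high-attaining (on entry) group.
high_attainer_grades = ["B", "C", "C", "D", "B", "C"]

avg_points = sum(POINTS[g] for g in high_attainer_grades) / len(high_attainer_grades)

# Map the average back to the nearest grade boundary.
nearest = min(POINTS, key=lambda g: abs(POINTS[g] - avg_points))
print(f"average points {avg_points:.1f} ~ grade {nearest}")
```

So a group like this averages 41 points, which sits just above the C boundary - the sort of "C+" figure being discussed, and well below what you might expect from pupils who arrived as high attainers.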

On the data front, collecting data and assembling it in different ways is very costly, I agree. Disseminating it in different graphs, much less so. I am touched by your faith that if a public body claims it has done something significant, it has, but I don't share it! All they have done is extract some data from a larger set and presented it in a different and perhaps more accessible way. But presenting data in different forms can make it useful for different audiences. If you don't find it useful, that's fine, don't use it.

Several people on this thread have said they have already found useful information in it.

BigBoobiedBertha Sat 02-Mar-13 18:00:23

And I am equally touched by your belief that governments are efficient and will not have spent any money getting all this data together, even if it is already available in another format. (Perhaps you have forgotten we are talking about Government bodies here, not rational human beings?) They have schools and other institutions filling in the same data 52 different ways rather than having them do it once and then manipulating the data themselves. I have seen some of the returns and documents schools have to complete. How many ways can you say the same thing, but on a different sheet of paper? It will have cost. Maybe I am an old cynic, but more likely I am a realist.

BoundandRebound Sun 03-Mar-13 18:38:08

Try this link
