OK, I admit I am curious, and I also felt like spending a relaxing 30 minutes with an Excel worksheet to find out which are the weakest 200 primary schools in the country.
Michael Gove announced in his speech to the National College for School Leadership on Thursday, 16 June 2011 that these schools will be converted to academies in 2012/13 because, to use the Secretary of State’s words, rather than those of the DfE News, the schools are those that “have most consistently underperformed”. This follows the announcement in the November 2010 Schools White Paper The Importance of Teaching (paragraph 23 of the Executive Summary) that the Government will
“Ensure that schools below the floor standard receive support, and ensure that those which are seriously failing, or unable to improve their results, are transformed through conversion to Academy status.”
The DfE website has a transparency site which, after some digging, leads to a claim that the data underlying statistical releases published by the Department since July 2010 are available, and that the December 2010 publication Key Stage 2 Attainment by Pupil Characteristics, in England 2009/10 contains the underlying data. This is not the case. School-level performance data are not found here as part of the National Statistics because they are not “produced to high professional standards set out in the National Statistics Code of Practice”. Apparently, aggregating pupil-level data to local authority level does meet the high professional standards.
Hunting round the DfE website I eventually found a KS2 underlying data website. This contains the standard information sent to journalists prior to the annual publication of KS2 results in December, with a footnote that legitimate researchers can ask for more. As I was not looking for the anonymised performance of each child on each question in the May 2010 KS2 SATs (which is stored nationally) but the English and mathematics results aggregated to school level, I downloaded the Excel file and had KS2 results for 14812 schools (excluding special schools, whose results are used to determine local authority performance). It is sobering to consider that this is made up of 13337 schools in 2008, 13457 in 2009 and 9815 in 2010. The lower 2010 figure can be explained by the teachers’ boycott, and a few schools have their results excluded each year because of excessive help given to children during the tests, but the rest is mainly due to the normal process of opening and closing schools, particularly schools where there is a build-up to a year 6 class, or very small schools where there are no year 6 pupils.
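For anyone who wants to repeat the exercise outside Excel, here is a rough sketch of this first step in Python with pandas. The file name and column names are my own illustrative guesses, not the DfE’s actual headings, so treat it as an outline rather than a recipe.

```python
# A rough sketch of the counting step, done in pandas rather than Excel.
# The file name and column names are illustrative guesses only.
import pandas as pd

ks2 = pd.read_excel("ks2_underlying_data.xls")  # hypothetical file name

# One row per school, with the percentage achieving level 4 in both
# English and mathematics for each year (blank where no result is held).
for year in ("2008", "2009", "2010"):
    col = f"pct_level4_both_{year}"             # hypothetical column name
    print(year, ks2[col].notna().sum(), "schools with results")

print("Total schools in the file:", len(ks2))
```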
The next task is to find the criteria used to select schools. The Secretary of State announced in the Schools White Paper (paragraph 6.26) that
“For primary schools, a school will be below the floor if fewer than 60 per cent of pupils achieve the ‘basics’ standard of level four in both English and mathematics and fewer pupils than average make the expected levels of progress between key stage one and key stage two.”
This is an increase in the floor standard, which was set at 55% in July 2008 (it had previously been 65%), and the proportion of pupils gaining level four in both English and mathematics in the KS2 assessments has long been a statistic published at school level. So it is easy to find which schools have reached the floor (albeit there will be many arguments about whether one additional non-English-speaking pupil on roll should be removed from the denominator, thus pushing the school over the floor).
At least in his NCSL speech the Secretary of State recognises the problem of small schools: “Of course primary test scores are more volatile than those in secondaries due to the smaller size of schools, so one has to treat data with additional care”. Schools with 10 or fewer pupils do not have their results reported. There were 160 schools with 11 pupils. One pupil dropping from a level 4 to a level 3 in English or mathematics will cause a 9 percentage point drop in the school’s performance.
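To put that volatility into numbers: each pupil in an 11-pupil cohort is worth roughly 9 percentage points of the published figure, which is a very large step against a 60% floor. A toy calculation:

```python
# How much one pupil's result moves a school's published percentage,
# for a few cohort sizes.
for cohort in (11, 20, 30, 60):
    print(f"{cohort} pupils: one result moves the figure by "
          f"{100 / cohort:.1f} percentage points")
# 11 pupils: one result moves the figure by 9.1 percentage points
# 60 pupils: one result moves the figure by 1.7 percentage points
```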
The relationship of the floor to the expected level of progress between key stage one and key stage two is difficult to work out from the published information. The expected level of progress is two levels, for example level 1 to level 3. The Excel spreadsheet gives separately the percentage of pupils achieving 2 or more levels of progress in English and mathematics, and also the ‘coverage’; inevitably, schools in areas of high mobility into the country will not have a KS1 record for every pupil. In a few schools the progress measure is based on only two-thirds of the pupils taking the KS2 SATs. And how is the average level of progress defined? I am not sure. The National Statistics publication quotes the figure for 2010 as 84% of pupils nationally making two or more levels of progress in English and 83% in mathematics, with a footnote that the method of calculating these figures has been changed to bring it into line with the Performance tables. Confusingly, the information accompanying the Performance tables quotes a median figure (English 87% and mathematics 86%). Presumably, the median is of school-level performance, which I would expect to be higher than a figure calculated across all pupils, as small schools tend to do better, mainly because small schools tend to be in more affluent areas.
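The made-up figures below illustrate why the two versions need not agree: if small schools do better than large ones, the median of school-level percentages will sit above the all-pupil percentage. The numbers are invented purely for the illustration.

```python
# Invented figures: one large school and two small ones, with the number of
# pupils making 2+ levels of progress in each.
schools = [
    {"pupils": 90, "progressed": 72},  # large school: 80% make 2+ levels
    {"pupils": 12, "progressed": 11},  # small school: ~92%
    {"pupils": 10, "progressed": 9},   # small school: 90%
]

all_pupil_pct = 100 * sum(s["progressed"] for s in schools) / sum(s["pupils"] for s in schools)
school_pcts = sorted(100 * s["progressed"] / s["pupils"] for s in schools)
median_school_pct = school_pcts[len(school_pcts) // 2]

print(f"all-pupil figure:     {all_pupil_pct:.0f}%")      # about 82%
print(f"median school figure: {median_school_pct:.0f}%")  # 90%
```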
The letter to local authorities on 1 March announcing the Government policy on Improving Underperforming Schools states that “where the children make better than average progress between Key Stages 1 and 2” the school “will be exempt from falling below the floor”.
In my selection of the weakest schools, I have included schools which have been below the 60% floor target in 2008, 2009 and 2010, but excluded schools where progress is equal to or above the national figure in English or mathematics (or both). This leaves 259 schools. The Secretary of State’s list is of schools which have been under the floor target for five or more years. These data are not published.
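Continuing the earlier sketch, this is roughly the filter I applied. The column names are again hypothetical, and the progress thresholds are the 2010 national figures quoted above (84% English, 83% mathematics); which ‘average’ the Department actually intends is one of the open questions.

```python
# The selection described above, against the hypothetical columns used earlier.
FLOOR = 60                       # the 'basics' floor standard
NATIONAL_PROGRESS_ENGLISH = 84   # 2010 national figure, 2+ levels of progress
NATIONAL_PROGRESS_MATHS = 83

below_floor_all_years = (
    (ks2["pct_level4_both_2008"] < FLOOR)
    & (ks2["pct_level4_both_2009"] < FLOOR)
    & (ks2["pct_level4_both_2010"] < FLOOR)
)

# Drop any school at or above the national progress figure in either subject.
below_average_progress = (
    (ks2["progress_english_2010"] < NATIONAL_PROGRESS_ENGLISH)
    & (ks2["progress_maths_2010"] < NATIONAL_PROGRESS_MATHS)
)

weakest = ks2[below_floor_all_years & below_average_progress]
print(len(weakest), "schools")   # 259 on my reading of the data
```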
Time prevents me from using Excel’s Pivot Tables to look at the characteristics of these schools, particularly geographical location, SEN pupils, attendance and so on. As it happens, I know a governor of one of the schools listed. It is an average-sized inner-city school, well supported by its local community, hosting a large SEN unit (whose pupils contribute to the school’s KS2 performance), with a recent ‘satisfactory’ OfSTED rating. I spoke to the governor yesterday, and if the school is on Mr Gove’s list, the governors have yet to be informed.
Meanwhile, there are 200 primary head teachers anxiously looking at their inboxes to see whether they have been labelled as one of the weakest 200 primary schools in the country, facing the likely prospect of being named and shamed.
Clearly, Michael Gove has learnt from Ed Balls’ fairly disastrous naming of secondary schools below the then floor target in Summer 2008. It did enormous damage to some schools, especially in the eyes of prospective parents, but did lead to significant help from the National Strategies in both personnel support and resources. It is too early to judge whether that policy was a success, but my hunch is that ten years from now it will be seen as a courageous thing to do, though the media handling could have been much better. Certainly, the support the named schools and their local authorities received was excellent. That support structure has now gone, and unlike three years ago, when a variety of solutions were proposed with additional resources, all these primary schools are being offered is a single solution: being chained into an Academy Trust, or as the DfE Press Notice puts it: “The rapid conversion of so many great schools to academies means there is now a larger pool of great schools to build chains and improve underperforming schools”. Given that such relevant experience as the established chains have is with secondary schools, it does not bode well for these schools.
The strategy is justified because the Education Secretary has said that “Evidence shows that the academy programme has had a good effect on school standards”, quoting a non-peer-reviewed study by a couple of academics from the LSE. There was a time when senior officials from the education department would boast that they would never let an idea out until it had been tested to destruction. Sadly, that was a long time ago. It also clearly shows Michael Gove’s dislike of the Government’s localism agenda and his desire to centralise decision making in Sanctuary Buildings.
There is one further difference between this exercise and Ed Balls’. Michael Gove justifies his decision to abolish five education quangos created by statute, and another half-dozen other bodies, on the grounds that he wishes to be accountable directly to Parliament for the responsibilities of these bodies. Ed Balls took personal responsibility through the National Strategies executive agency. Michael Gove is divesting himself of responsibility by handing the improvement of these schools to private bodies: Academy chains. Admittedly, he can dismiss these bodies almost at the stroke of a pen (in private, because of commercial confidentiality). Is this what greater transparency is about?