Learning from The Schools Guide
30th November 2021 by Timo Hannay [link]
A few days ago we launched The Schools Guide, the latest iteration of a project we originally announced in February to put better information in the hands of families and prospective staff who are assessing their school options. The response to the first version was tremendously positive – thank you to everyone who wrote in, participated in our user research, or just made use of the site. The latest version incorporates many of the excellent suggestions we received, such as:
- Information about primary (as well as secondary) schools
- The ability to search by place name or postcode (as well as by school)
- Personalised lists and rankings of local schools
- Additional information about aspects such as GCSE and A-level subject range, demographic mix, post-school destinations and provision for special educational needs
- And, inevitably, lots, lots more
For those who want to simply go ahead and use the site, there's a Getting Started Guide and a video (below).
But for those who want to dig into the data a bit first: you're our kind of people, so read on. We have some interesting insights for you, including:
- Correlations between indicators are generally very weak, which suggests that almost all schools have both strengths and weaknesses; whether or not they are 'good' depends in part on personal priorities
- Competition for places at any given school is a poor indicator of educational effectiveness. This is especially true for primary schools, where many objectively high-performing schools are also relatively easy to enter
- Regions, as well as schools, have distinctive strengths. For example, London has the best academic performance, but the worst environment; schools in the South West show less segregation but weaker finances and sixth forms; the North East is great for primary education, but has lower-performing secondary schools.
An introduction to indicators
For secondary schools, The Schools Guide uses ten indicators to reflect a range of characteristics. We have previously looked at the relationships between these and concluded, among other things, that correlations between measures of educational effectiveness (especially 'Progress') and ease of entry (ie, 'Admissions') are rather weak. This contradicts the common assumption that all high-performing schools are difficult to enter – or, conversely, that all schools at which it's relatively easy to secure a place must therefore be low-performing.
With the addition of primary schools to the latest version of The Schools Guide, we can ask whether the same is true there. The short answer is yes, only more so.
For primary schools, there are eight indicators (since the 'Destinations' and 'Sixth form' indicators used for secondary schools don't apply here). They are:
- Admissions: How readily are places available?
- Attainment: How good are test grades at age 11?
- Attendance: How reliably do pupils show up at school?
- Disadvantaged pupils: What are the outcomes for poorer children?
- Environment: How safe and healthy is the neighbourhood of the school?
- Finances: Does the school appear to be on a sustainable financial footing?
- Progress: How much academic progress do pupils make between the ages of 7 and 11?
- Representation: How reflective is the school of its local community?
(Further details are available in our previous post and on The Schools Guide FAQ page.)
Correlation or no relation?
Figure 1 is a correlation matrix for all eight of these indicators, showing how they relate to one another statistically. Positive correlations are in blue and negative ones in red, while the size and intensity of the dots correspond to their strength. (The blue diagonal is simply a consequence of the fact that each indicator correlates perfectly with itself.)
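For readers who want to reproduce this kind of analysis on their own data, a correlation matrix like the one in Figure 1 takes only a few lines. The sketch below uses pandas with randomly generated scores; the eight indicator names are real, but the data and column layout are purely illustrative, not our actual pipeline.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
indicators = ["Admissions", "Attainment", "Attendance", "Disadvantaged",
              "Environment", "Finances", "Progress", "Representation"]

# Hypothetical scores: one row per school, one column per indicator.
scores = pd.DataFrame(rng.normal(size=(500, len(indicators))),
                      columns=indicators)

# Pairwise Pearson correlations, values in [-1, 1]; the diagonal is 1
# because each indicator correlates perfectly with itself.
corr = scores.corr()
print(corr.round(2))
```

Plotting libraries such as matplotlib or seaborn can then render `corr` as the kind of dot-matrix shown in Figure 1.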
As previously seen for secondary schools, correlations with the 'Environment', 'Finances' and 'Representation' indicators are almost non-existent, suggesting that they are independent of each other – and indeed of almost all the other characteristics listed.
Beyond this, 'Admissions' correlates negatively with 'Attainment' and 'Attendance' – which is to be expected if schools with higher attainment and better attendance tend to be more popular. But these effects are weak, with correlation coefficients of -0.36 and -0.34, respectively. The relationship with 'Progress' – a much better indicator of educational effectiveness than 'Attainment' – is weaker still (-0.25).
Figure 1: Correlation matrix of indicators for primary schools
This suggests that families aren't putting a lot of weight on academic performance – or anything else that's readily measurable – when selecting primary schools. Perhaps this is because they care more about ineffable aspects such as school ethos, or because they tend to follow the herd to schools that (for no readily identifiable reason) are already popular, or because many simply plump for the nearest school that doesn't seem terrible. Whatever the reason, one consequence is that many primary schools that rate highly on these indicators are also relatively easy to enter. We created The Schools Guide precisely to help families identify these kinds of schools.
To see this more clearly, consider Figure 2, below, which shows how individual schools are distributed with respect to the different indicators. Looking at 'Admissions' against 'Attainment' or 'Progress' for the 'National sample' (a random selection of about 10% of all primary schools, for ease of display), you may be able to make out a slight correlation: the top-left and bottom-right corners contain more dots than the bottom-left and top-right. But the points are scattered far and wide, so at the level of an individual school it is not at all safe to assume that good 'Progress' necessarily results in tough 'Admissions'.
(Use the menus below to explore other correlations and hover over the dots to see corresponding values. Click on the figure legend to turn on or off individual regions. You can also click on the dots in the correlation matrix in Figure 1 above to view the corresponding plot in Figure 2 below.)
Figure 2: Indicators by school and region
One nation?
Finally, let's look at how the full range of school indicators – for both primary and secondary schools – vary by region. This is shown in Figure 3.
For primary schools, 'Admissions' show a fairly clear north-south divide, with more capacity in the south, especially in London. But this picture is less clear-cut for secondary schools.
'Attainment' is also highest in London among both primary and secondary schools, though for primary schools the North East also does admirably well, especially given that, like London, it has very high proportions of disadvantaged pupils. 'Progress' shows a broadly similar pattern, with high rates for primary schools in London and the North East, but only London prevailing among secondary schools. The same is also true for post-secondary 'Destinations'.
'Attendance' at primary schools is relatively even across the country, but at secondary schools it is lowest in the North East, Yorkshire and The Humber, and the South West.
'Disadvantaged' primary pupils do best in London and the North East (both of which, as already mentioned, have lots of poor kids). Southern regions do noticeably poorly. However, among secondary schools the North East shows the lowest average score in the country, with London once again number one.
Perhaps unsurprisingly, London's Achilles heel is the 'Environment', with relatively high average pollution and crime levels. In this regard, the North East does best of all. (Of course, this measure is essentially the same across both primary and secondary phases.)
'Finances' are weakest in the South West, especially for primary schools, but also for secondaries.
'Representation' is best in the South West, for both primary and (especially) secondary schools, though it's important to note that this is at least in part because the populations there are relatively non-diverse to begin with, especially in terms of ethnicity.
In contrast, the South West performs worst of all on the 'Sixth form' indicator, with London comfortably top, followed by the West Midlands.
Figure 3: Average Schools Guide score by region
So which region is 'best'? As for individual schools, there are clear trends, but interpreting them depends on what you value. London has the best overall academic performance, especially for disadvantaged pupils, but also the worst environment. Schools in the South West show less segregation but weaker finances and sixth forms. The North East is great for primary education, but has lower-performing secondary schools.
Importantly, each regional average also hides a great deal of variation between individual schools. The Schools Guide is designed to help you understand these for yourself, guided by your own circumstances and priorities.
Of course, all of this is relevant not only to personal school choice, but also national education policy and the government's levelling-up agenda. And it hardly scratches the surface of the insights that The Schools Guide data can offer – not least from its collection of 'Staff' indicators, which we have left entirely unexamined here. These and other aspects will be the subjects of future posts.
As always, we welcome your comments and suggestions: [email protected]
What Oak National Academy usage tells us about education during the pandemic
5th November 2021 by Timo Hannay [link]
Update 5th November 2021: Here is some TES coverage of our analysis.
Oak National Academy is a free, government-funded online service established in April 2020 in response to school closures following early outbreaks of COVID-19 in the United Kingdom. At the time of writing, it provides over 40,000 resources of various kinds designed to support the education of children aged 4 to 16. Earlier this year, at Oak's request, we conducted an analysis of their usage data. This post describes what we found and what these results tell us about England's education system during the pandemic. The data cover the period from 1st January to 31st May 2021.
Some of the results described here have appeared previously in Oak's 2020/21 Annual Report. All the data presented are national or regional aggregates and are anonymous with respect to individual pupils, teachers and schools.
To summarise our main findings:
- During the early 2021 peak, Oak provided well over 10,000 resource downloads for teachers each day, as well as more than 3 million daily online lessons for pupils. While online lessons inevitably fell to much lower levels following school re-openings in March, teachers continued to download online resources in large numbers, maintaining about a third of levels seen during lockdown.
- Usage was broadly based. Over half of all schools in England, and nearly three-quarters of secondary schools, logged at least some teacher activity during the period analysed. Coverage was higher among state schools (54%) than independent schools (39%).
- Teachers and pupils in poorer areas appeared to make disproportionately heavy use of Oak resources, though there is also evidence that certain types of engagement were lower among more disadvantaged children.
- Computers, as opposed to mobile phones and other devices, were less likely to be used by pupils in poorer areas, and sessions conducted on mobile phones were only about a quarter as long as those on computers. Furthermore, even those poorer pupils who were using computers tended to show shorter session lengths.
- Across England, Google Classroom was a more popular choice than Microsoft Teams for teachers to share material with pupils. This was particularly true in London and the south, where Google dominated. Only in the East Midlands was Microsoft ahead.
Open and shut case study
Figure 1 shows teacher and pupil activity levels throughout the period covered in this analysis. There are clear weekly cycles, with peaks at the beginning or middle of each week and much lower activity during the weekends. For pupil lesson starts, overall activity declines from a maximum of nearly 3.6 million a day in early January to just under 1.4 million at the beginning of March, with a temporary dip during the half-term holiday in mid-February. (This general decline is similar to the within-term trends we have seen for other online learning resources.) Following the re-opening of schools in early March, activity inevitably falls to much lower levels, with weekly peaks in the 50,000-100,000 lessons-a-day range.
Teacher shares show a different pattern, with peaks near the start and end of each half term. Finally, by comparison with the other measures shown here, teacher downloads declined much less following school re-openings, maintaining about a quarter to a third of their lockdown levels. This suggests that teachers continued to make use of the resources provided for face-to-face teaching even when pupils were no longer studying primarily online. This is consistent with unpublished Teacher Tapp survey results, which indicate that 45% of responding teachers in England used Oak during May-October 2021.
(Use the menu below to view different activity metrics. Hover over the lines to see corresponding data values.)
Figure 1: Pupil and teacher activity during early 2021
Cover teachers
Oak's teacher activity (unlike pupil activity, which we will analyse below) can be assigned to particular schools by linking it to the corresponding registrant data. Figure 2 shows the proportions of schools that logged at least some teacher activity between January and May 2021. Across all schools, there was a small degree of regional bias, with the North East and the South West showing lower coverage than London and the North West. A broadly similar pattern is evident when looking just at primary schools (unsurprising since they are easily the largest single group). Secondary schools showed more even coverage, but here too, some regions (the North East and West Midlands) recorded slightly lower levels than others (the East of England and South East).
Note also the much higher overall coverage among secondary schools. Across all schools (in all regions), about 53% were represented, but 'only' 49% of primary schools compared to 74% of secondary schools. This may be in part because secondary schools found Oak resources more relevant, but is also a consequence of them being much bigger, so the probability of at least one teacher at each school having used Oak is correspondingly higher. (This size effect is a general principle to keep in mind when interpreting the observations that follow.)
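The size effect described above is easy to quantify. If each teacher independently has some probability p of using Oak, a school with n teachers logs at least one user with probability 1 − (1 − p)^n, which rises quickly with n. The sketch below uses made-up values of p and staff sizes purely for illustration; these are not estimates from our data.

```python
def coverage_probability(p: float, n: int) -> float:
    """Chance that at least one of n teachers uses the service,
    assuming each does so independently with probability p."""
    return 1 - (1 - p) ** n

# Illustrative only: a 5% per-teacher probability across hypothetical
# primary (10 teachers) and secondary (60 teachers) staff sizes.
small = coverage_probability(0.05, 10)   # ~0.40
large = coverage_probability(0.05, 60)   # ~0.95
```

Even with an identical per-teacher propensity, the larger school is far more likely to register at least some activity.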
Among primary schools, coverage was greater among state schools, schools with higher levels of disadvantage, those located in poorer areas and those with higher proportions of pupils that have English as an additional language (EAL).
Secondary schools showed somewhat different patterns. Here too, Oak was much more heavily used in state schools and those with higher proportions of EAL pupils. But in contrast to primary schools, they also showed somewhat greater coverage among schools located in more affluent areas and those with higher Ofsted ratings. For secondaries, coverage was also higher among academies and certain types of faith school.
(Use the menus below to explore other school categorisations and phases. Hover over the columns to see underlying values and sample sizes.)
Figure 2: Proportions of schools showing teacher activity
Figure 3 breaks down the coverage data into two periods: during school closures (red columns) and after school re-openings (blue). This enables us to see how the balance of coverage altered when lockdown restrictions were lifted. In general, things changed roughly in proportion across different types of school, but there were some exceptions.
Among primary schools, state school coverage dropped by more than that for independent schools, somewhat evening out the previous imbalance. There was a similar, if smaller, effect by school deprivation level, but not by proportion of EAL pupils. Secondary schools showed a relative widening of the coverage gap by sector, but a narrowing by local deprivation level and Ofsted rating.
(Use the menus below to explore other school categorisations and phases. Hover over the columns to see underlying values and sample sizes. Click on the legend to show or hide individual data sets.)
Figure 3: Proportions of schools showing teacher activity
Poverty and participation
This section looks at how use of Oak resources varied by deprivation level. It uses both teacher and pupil activity measures. Since pupil activity is anonymous, we cannot link it to particular schools, but we do have information about the top-level postcode from which users accessed the site and can therefore tie this to local deprivation measures. In particular, we will use the Income Deprivation Affecting Children Index (IDACI), an official measure of childhood poverty. For the purposes of this analysis, each activity on the Oak website is assigned to an IDACI quintile (1 = lowest deprivation, 5 = highest deprivation), enabling us to examine any systematic differences between poorer and more affluent neighbourhoods.
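For readers curious how such a quintile assignment works in practice, here is a minimal sketch using pandas' `qcut` on made-up deprivation scores; the scores, and the postcode lookup they would come from in reality, are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical IDACI-style scores (higher = more deprived), one per
# postcode area. Real IDACI scores come from official lookup tables.
idaci = pd.Series(rng.uniform(0, 0.6, size=1000))

# Split into five equal-sized bins: quintile 1 = least deprived,
# quintile 5 = most deprived.
quintile = pd.qcut(idaci, 5, labels=[1, 2, 3, 4, 5]).astype(int)
```

Each activity record can then be tagged with the quintile of its postcode area and aggregated accordingly.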
Figure 4 shows the same usage metrics we saw in Figure 1, but divided up by IDACI quintile. Activity is expressed 'per thousand pupils' in order to allow for the different numbers of schoolchildren in each quintile. Note that the total numbers of pupils used for this calculation are derived from school rolls, while most accesses will have been from each child's home address. These two locations (school and home) are not always in the same quintile, so this analysis is necessarily approximate.
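The per-thousand-pupils normalisation itself is straightforward; a sketch with illustrative (entirely made-up) counts:

```python
import pandas as pd

# Hypothetical daily lesson starts and pupil rolls per IDACI quintile
# (1 = least deprived, 5 = most deprived). Numbers are illustrative.
df = pd.DataFrame({
    "quintile": [1, 2, 3, 4, 5],
    "lesson_starts": [420_000, 450_000, 500_000, 560_000, 610_000],
    "pupils": [1_600_000] * 5,
})

# Normalise raw counts by the pupil population of each quintile.
df["per_thousand"] = df["lesson_starts"] / df["pupils"] * 1000
```

Dividing by the roll before comparing quintiles is what makes a "poorer areas used Oak more heavily" claim meaningful, rather than an artefact of population size.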
In general, poorer areas saw higher levels of activity, whether teacher downloads, teacher shares or pupil lesson starts. Relative levels of pupil activity were roughly in line with teacher activity. Notwithstanding the findings about school coverage described above, this suggests that schools and pupils in poorer areas made disproportionately heavy use of Oak resources.
(Use the menus below to explore other activities. Hover over the columns to see underlying values.)
Figure 4: Pupil and teacher usage by local childhood deprivation level
There was, however, a tendency for pupils in poorer areas to appear somewhat less engaged, as shown in Figure 5. Specifically, these children were somewhat less likely to view videos to the half-way point or all the way to the end, though this effect was not particularly large.
(Use the menus below to explore other activities. Hover over the columns to see underlying values and sample sizes.)
Figure 5: Pupil and teacher usage by local childhood deprivation level
Technology information
A common source of concern while schools were closed was the ability of pupils, especially poorer ones, to access suitable devices for participating in distance learning. Figure 6 shows the proportion of Oak lessons that were begun on a computer (as opposed to a tablet, mobile or other device such as a smart TV or games console). As for the usage data in Figure 1, there is a weekly cycle. This is somewhat curious because it indicates that even when children were stuck at home, they were more likely to use a computer during the week (though this trend was certainly more pronounced after early March, once schools had re-opened).
In general, about 65%-75% of lessons taken during the early-2021 lockdown were started on a computer, and this rose by 5-10 percentage points when pupils returned to school – though, as we have already seen, the number of online lessons dropped at the same time.
(Hover over the line to see corresponding data values.)
Figure 6: Proportion of lessons started on a computer
Perhaps unsurprisingly, this picture is not uniform across the country. Figure 7 shows the proportions of lessons begun on a computer, segmented by local IDACI quintile. There was a very clear trend, with computer usage lower in poorer neighbourhoods. This imbalance was greatest during the school closures (red columns) and reduced, but not eliminated, after schools re-opened (blue).
(Hover over the columns to see underlying values. Click on the legend to show or hide individual data sets.)
Figure 7: Proportion of lessons started on a computer, by local deprivation rate
These technological disparities may be relatively unimportant if they do not materially affect children's ability to engage in learning. Intuitively, mobile phones, with their smaller screens and reduced processing power, seem much less suitable than computers or even tablet devices. But are there any objective data to support this belief? Yes. Figure 8 shows the median session length by device type. (These are normalised so that the average value for computer users during the January-March school closures is 100.) Mobile users had sessions only 20-30% as long as those of computer users. Session length for tablet users was closer to that for computer users than that for mobile users, but was nevertheless lower, especially during lockdown and weekends.
(Hover over the lines to see corresponding data values. Click on the legend to show or hide individual data sets.)
Figure 8: Session length by device type
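The normalisation used in Figure 8 – indexing everything to computer sessions during closures – can be sketched as follows, with made-up session lengths standing in for the real data:

```python
import pandas as pd

# Illustrative median session lengths in minutes by device and period;
# the values are invented, not Oak's actual figures.
sessions = pd.DataFrame({
    "device": ["computer", "tablet", "mobile"] * 2,
    "period": ["closed"] * 3 + ["open"] * 3,
    "median_minutes": [24.0, 18.0, 6.0, 20.0, 16.0, 5.5],
})

# Index all values so that computers during closures = 100.
baseline = sessions.loc[(sessions.device == "computer") &
                        (sessions.period == "closed"),
                        "median_minutes"].iloc[0]
sessions["session_index"] = sessions["median_minutes"] / baseline * 100
```

Indexing to a single baseline makes it easy to compare devices and periods without revealing (or depending on) the absolute session durations.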
Furthermore, poverty seems to have additional effects beyond those that we might attribute solely to technology access. Figure 9 shows that even those poorer pupils who did use computers or tablets tended to have shorter session lengths, whether schools were closed or open. Unfortunately we can't tell from these data alone what might have caused such disparities or what, if any, educational impact they had.
(Use the menu below to switch between open and closed periods. Hover over the columns to see underlying values.)
Figure 9: Session length by local deprivation rate
Digital duopoly
Finally, Figure 10 shows the proportions of teacher shares that used either Google Classroom or Microsoft Teams – which between them accounted for the vast majority of such activity. Across England, Google Classroom was the more popular choice, overwhelmingly so in London and the south. Only in the East Midlands did Microsoft come out ahead.
(Hover over the columns to see underlying values.)
Figure 10: Proportion of teacher shares by platform
Special thanks to our collaborators at Oak National Academy for supporting this analysis and allowing us to publish the results. We hope you find them useful. As always, we welcome your feedback: [email protected].