The press have given considerable attention to a paper by Simon Burgess, Professor of Economics at Bristol: “Understanding the success of London’s schools”, CMPO Working Paper Series No. 14/333, University of Bristol, October 2014.
The BBC splashed it as: Diversity 'key to London GCSE success'
http://www.bbc.co.uk/news/education-30002991
The Guardian as: London’s GCSE success due to ethnic diversity in capital’s schools
http://www.theguardian.com/uk-news/2014/nov/12/london-gcse-success-ethnic-diversity-schools
The Daily Mail: Ethnic diversity 'boosts GCSE results': Cities with large numbers of children from immigrant backgrounds do better because they work harder. Schools in London and Birmingham have good results due to minority pupils. White British students make slower progress as they are less ambitious. Bristol University found that ethnic minorities have greater expectations
http://www.dailymail.co.uk/news/article-2830862/Ethnic-diversity-boosts-GCSE-results-Cities-large-numbers-children-immigrant-backgrounds-better-work-harder.html
The Times of London used it in an editorial (12 November 2014) “High Class Immigrants: Research shows that migrants do well at school and help the locals too”. They admit the research has yet to be peer reviewed, so I am stepping into the breach, with nothing but the public interest at heart, as always.
As you will see, the press have tended to suggest that the success is due to diversity, and that immigrants are improving outcomes, rather than that schools are improving outcomes. Let us have a look at the actual paper, linked below.
http://www.bristol.ac.uk/cmpo/publications/papers/2014/wp333.pdf
Prof Burgess argues:
We showed some time ago that ethnic minority pupils make better progress through school than white British pupils (see Wilson et al (2005, 2011) and Burgess et al (2009)). Given that these pupils typically live in more disadvantaged neighbourhoods and come from poorer families, their advantages must be less material than books, educational visits and computers. It is argued that ethnic minority pupils have greater ambition, aspiration, and work harder in school. This is the main argument here – London has more of these pupils and so has a higher average GCSE score than the rest of the country.
..there is a London premium in pupil progress of 9.8% of a standard deviation. I show that ethnic composition matters a great deal: in fact, differences in composition account for all of the gap. If London had the same ethnic composition as the rest of England, there would be no ‘London Effect’. Furthermore, there is no significant difference between the progress of white British pupils in London and in the rest of the country. Looking at conditional pupil progress, a London premium of 11% is also entirely eliminated by controls for ethnicity; this is also robust to conditioning on pupil and neighbourhood characteristics. Nor is this a new phenomenon: the London progress premium has existed for the last decade and is entirely accounted for by ethnic composition in each year.
Comment: “9.8% of a standard deviation” is hard to understand. In an intelligence test it would be equivalent to 1.5 IQ points. In terms of overall mean GCSE scores I calculate, from other data, that it would be 1.8 points out of an average of roughly 42.3 points.
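For anyone who wants to check those conversions, the arithmetic is a one-liner in each case; the GCSE standard deviation below is my own back-calculation from the figures just quoted, not a number taken from the paper.

# Converting "9.8% of a standard deviation" into more familiar units.
effect = 0.098               # the London premium, in standard deviation units

iq_sd = 15                   # conventional standard deviation of IQ scores
print(effect * iq_sd)        # ~1.47, i.e. roughly 1.5 IQ points

gcse_sd = 18.4               # assumed SD of the capped Best-8 points score,
                             # back-calculated from the 1.8-point figure above
print(effect * gcse_sd)      # ~1.8 points, against a mean of roughly 42.3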
There is nothing inherently different in the educational performance of pupils from different ethnic backgrounds, but the children of relatively recent immigrants typically have greater hopes and expectations of education, and are, on average, consequently likely to be more engaged with their school work. These results help to explain the ‘London Effect’; they do not explain it away. My argument is that the London effect is a very positive thing, but much of the praise for this should be allocated to the pupils and parents of London for creating a successful multi-ethnic school system. By the same token, there is less evidence that education policies and practices had a large part to play in terms of innovative policies.
The claim that “There is nothing inherently different in the educational performance of pupils from different ethnic backgrounds” does not accord with most research on scholastic ability, but that is what makes the paper more intriguing. It appears to be asserting something which is contradicted by the actual UK results as published by the relevant government statisticians, and does not accord with international data.
To get down to the detail, Burgess has used the standard “Best 8” procedure: the scores on the best 8 GCSE exams from the National Pupil Database 2012/13 are used to assess scholastic attainment. GCSE results are given in grades, scored from 8 for an A* and 7 for an A down to 1, a system which loses all the fine detail of the actual percentage results and also potentially penalises brighter students who take many examinations (Burgess says he followed this procedure because it does not “over-reward” such students, though it blunts the achievements of the bright and diligent), all without really controlling for course difficulty. Deary et al. (2007) give the full results, as well as the best 8, and show the detailed results for each major exam, which is very instructive. That paper shows, among other things, that the individual sciences are taken by very few pupils. Sadly, the Deary et al. publication also has to work with the crude grading system. What is the point of examiners marking papers in detail and then the system trashing the results by reducing them to grades?
But the main results of Deary et al. are salutary for any researcher seeking to talk about scholastic achievement. They carried out a 5-year prospective longitudinal study of 70,000+ English children, looking at the association between psychometric intelligence at age 11 years and educational achievement in national examinations in 25 academic subjects at age 16. The correlation between a latent intelligence trait (Spearman's g from CAT2E) and a latent trait of educational achievement (GCSE scores) was 0.81. General intelligence contributed to success on all 25 subjects. Variance accounted for ranged from 58.6% in Mathematics and 48% in English to 18.1% in Art and Design. Girls showed no advantage in g, but performed significantly better on all subjects except Physics. This was not due to their better verbal ability. At age 16, obtaining five or more GCSEs at grades A–C is an important criterion: 61% of girls and 50% of boys achieved this. For those at the mean level of g at age 11, 58% achieved this; a standard deviation increase or decrease in g altered the values to 91% and 16%, respectively.
Simply stated, if you want to talk about the causes of GCSE results at 16 you ought to quote this paper and you ought to distinguish between psychometric intelligence at 11 and educational attainment thereafter.
Burgess then goes on to explain: The best way to isolate the contribution of schools, and by extension a city-wide school system, is to analyse pupil progress: to see how well pupils do at GCSE taking account of their prior test scores before entering secondary schools. This necessarily focusses attention on secondary schools (see Greaves et al 2014 for a discussion of primary schools). The prior test scores are each pupil’s performance in the Key Stage 2 tests at age 11, in English, Maths and Science. I define pupil progress as the residual of a regression of GCSE capped 8 points score conditional on these KS2 test scores. (My emphasis.)
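Stated as a calculation, the measure is nothing more than a regression residual. A minimal sketch with simulated data (my own notation, not Burgess’s code):

import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical Key Stage 2 scores at age 11 (English, Maths, Science) and a
# capped Best-8 GCSE points score that partly depends on them.
ks2 = rng.normal(size=(n, 3))
gcse = 0.5 * ks2.sum(axis=1) + rng.normal(scale=0.8, size=n)

# Ordinary least squares regression of the GCSE score on the KS2 scores.
X = np.column_stack([np.ones(n), ks2])
beta, *_ = np.linalg.lstsq(X, gcse, rcond=None)

# "Pupil progress" = the residual: how far each pupil's GCSE score sits
# above or below the level predicted from their age-11 results.
progress = gcse - X @ beta

Averaging these residuals over London pupils and over pupils elsewhere gives the kind of comparison the paper reports.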
So, when Burgess talks about “progress” it is not progress as we would usually understand it, in the sense of students being judged by how far they have progressed from ignorance to knowledge (achievement), but how far they have progressed given their earlier achievements. A child who has done badly at primary school but has then improved at secondary school will be judged to have made greater “pupil progress” than a much higher-performing child who remains at the top of the class throughout their schooling. Burgess does a further version, “conditional pupil progress”, which includes a correction for poverty, as measured by eligibility for free school meals, but the main problem still stands. (Progress estimates are also compounded by the questionable assumptions of the “poverty correction”, but we have covered that many times before.)
The paper is not about the final achievement of the pupils, but about their progress when allowance is made for their prior ability (as demonstrated by early assessments at age 11). We seem to have a confusion between “progress” (achieving a high standard) and “progress” (improving when earlier ability is taken into account). The former is of benefit to society (school leavers able to work or go on to further study); the latter is one way of estimating whether secondary schools add value, all things considered. Measuring cognitive ability on a “school far” test would be a better way of getting a baseline for later estimates of added value in scholastic achievement, but, unlike the Deary paper, this paper shows no cognitive estimates.
You might, at this point, wish to stop reading.
Burgess is trying to calculate the value that secondary schools add to children’s scholastic achievements. This is a valid exercise, particularly if those schools are talking up their achievements, and boasting without proofs. However, “pupil progress” does not entirely achieve that aim. In fact, the schools may be adding lots of value, but getting results which are in line with predictions based on ability.
To explain this, in words short enough to be understood by science journalists and the leader writers of The Times (at one time a paper of record), consider the following. A bright child, more scholastically able than 80% of their class, finishes primary school with good marks. The child doesn’t know very much, but they are learning with each year of education. They go on to secondary school. At age 16 the child is still more scholastically able than 80% of their class, but they have learned a lot more. Using this type of pupil progress as a measure will make it seem as if they have not made any progress. In the jargon, there is no residual from the regression line. They are doing no better than expected, even though in fact they are showing more knowledge and more developed skills. The progress measure does not show us what level they have progressed to, but only the distance they have travelled according to various assumptions, including assumptions about poverty.
Of course, only a fool would think that children hadn’t learned anything because they had progressed up the system at their usual speed.
Now consider a child who does less well at primary school, perhaps because they are slow to mature, or a recent immigrant. At 11 they are in the bottom 20% of the class. At secondary school they mature, or in the case of immigrants, learn English. Now they do better, and rise to the 30th percentile of the class, a massive 50% improvement. Bingo, they have made a brilliant contribution to the progress score for the school. If the school is being judged on this sort of “progress” it would be smart to find lots of young immigrants who have much to learn.
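To put rough numbers on the two children described above, here is a toy calculation which assumes, for simplicity, that the regression line simply carries forward the age-11 standardised score, so that “progress” is just the change in z-score:

from scipy.stats import norm

# Percentile ranks at 11 and 16, converted to z-scores on a common scale.
# Pupil A stays at the 80th percentile; Pupil B rises from the 20th to the 30th.
zA_11, zA_16 = norm.ppf(0.80), norm.ppf(0.80)
zB_11, zB_16 = norm.ppf(0.20), norm.ppf(0.30)

print(f"Pupil A 'progress': {zA_16 - zA_11:+.2f}")   # +0.00: no measured progress
print(f"Pupil B 'progress': {zB_16 - zB_11:+.2f}")   # +0.32: apparently real gains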
So, if cities like Birmingham and London have lots of recent immigrants, those groups will do poorly at primary school but, as they become acculturated to England, may subsequently do better. The White British locals will already be acculturated, so there is no progress for them to make on that front. This supposition is confirmed in figure F2 which shows that the higher the non-white population the higher the “progress” score.
Burgess goes on to explain that Birmingham shows a greater “London effect” than London. Of course, intelligence research usually shows that brighter people move to cities, but I think that this particular finding is an artefact of the progress measures being used in this paper. Having lots of recent immigrants will increase the likelihood of apparent progress at later ages.
If you look at the “large city versus rest of England” contrasts in the paper, Birmingham is far ahead, London slightly less so, Manchester a bit behind the national average, and Liverpool very close to it. It may be due to the proportion of Asian pupils (look at Table T5).
Burgess also explains that those immigrants who did not complete the age-11 assessments (perhaps because they arrived as teenagers) were dropped from the analysis, so we cannot judge the progress of this minority. They tended to have lower GCSE scores, possibly because they were late to acculturate.
In fact, a real measure of progress would be to find a test or broad range of tests which could be given throughout early life and into adulthood. Perhaps a wide-ranging general knowledge and skills evaluation (roughly like those carried out by the OECD) would show how well a pupil had been prepared for earning a living.
http://drjamesthompson.blogspot.co.uk/2013/10/how-illiterate-is-oecd.html
You will note that the OECD is surprised and concerned to find that at the end of formal education large numbers of people cannot do very much in the economy.
There is something missing from the Burgess paper, which is to answer the question: How good are pupils’ scores at the end of secondary education? It is very hard to find the answer to this question in the paper. I think, but cannot be sure, that the answer is given in Appendix 3 on page 33, which may be further than most science journalists are willing to read, assuming they have read the paper at all.
Burgess has chosen to show all the GCSE totals in standardised scores, which makes them harder to interpret. Plain statistics are always preferable: the actual scores would allow immediate comparison with other publications, whereas standardised scores obscure those key benchmarks. The standardised scores also obscure the pass rates, which, as we will see later, are a major cause of distortions in reporting school progress and racial gaps in achievement.
It seems that the highest achieving students are Pakistani, then Black Caribbean and White British and Bangladeshi, and the lowest performing are the Chinese. Obviously, I have made a mistake in reading this table, so I turned from the paper to the latest Government statistics for the relevant year, 2013.
https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/280689/SFR05_2014_Text_FINAL.pdf
One of the headlines is: Chinese pupils remain the highest attaining ethnic group. The percentage of Chinese pupils achieving 5 or more GCSEs at grade A* to C or equivalent including English and mathematics is 17.5 percentage points above the national average.
This is in line with everything we know about the intellectual and scholastic ability of the Chinese. I have apparently read the table back to front. However, if Appendix 3 is correct, then in the UK in 2013 75% of students get a pass mark, and have an equivalent IQ of 110. Time to go to other sources of final GCSE statistics.
Here are the data on scholastic attainment, giving the results in both 2007 and 2011. Without intending to, they also show how the gap between ethnic groups can be manipulated by making exams easier, or by reporting the best results without requiring that they include English and Maths.
http://drjamesthompson.blogspot.co.uk/2013/11/can-economist-be-trusted-on-intelligence.html
To save time, here are the scores in that posting:
To see how the more recent 2011 results look with an easy 58% pass rate, concentrate on the higher maroon histograms on the right of each pair. The Chinese and Indians are ahead, the rest falling gradually near or behind the White British level, with Black Caribbeans last except for the small numbers of Roma. Now look at the earlier, harder 2007 results, with a 45% pass rate, marked in light purplish blue. Notice how the Chinese and Indians are still ahead but the other groups are in more difficulty. The figures are proportions passing, not the actual scores, which would show the Chinese even further ahead. Good trick, isn’t it?
La Griffe du Lion explained how this was done in 2004, and educators have not been shamed into dropping it. If you make the pass rate a little higher every year by making the test easier, for several years you will get an apparent closing of the gap, without any fundamental change in the scores. This is because the apparent percentage gap is a function of the two bell curves and the pass mark which is being used as a cut-off.
Consider two normally distributed populations, one better at scholastic achievement than the other by one standard deviation, such that 50% of the top group can pass an exam but only 16% of the bottom group pass that same exam. There is a mean difference in scholastic attainment, shown by the distance between the two means. The newspaper headline figure, for those who are not used to looking at normal distributions, is that there is a 50% - 16% = 34 point gap. That makes a good headline, even though it depends on a particular pass rate, and ignores the best measure, which is the difference between the means.
At that point, if you are an educationalist with a political position, announce that you are going to transform the educational system (as Bush did in Texas in the 90s). Now, without changing the schools or the teachers, change the pass rate slightly, either by making the exam slightly easier or just passing more children with lower marks, or a bit of both. Keep doing that every year and the apparent percentage gap will eventually come down: with 98% of the top group and 84% of the bottom group passing (failure rates of 2% and 16%), the headline gap is only 14 points. Of course, the actual ability levels have not changed, and the areas under the normal curves have not changed, but by moving the cut-off point and using the misleading point-gap statistic you can probably fool most journalists. (Once you have got to the end of the curve the trick runs out of steam, so, flushed with success, you move to another school district at a higher salary and repeat it.)
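The whole trick can be reproduced in a few lines, assuming two normal distributions whose means differ by one standard deviation, as in the example above; only the cut-off moves.

from scipy.stats import norm

gap_in_sd = 1.0                    # mean difference between the groups, in SD units

for cutoff in (0.0, -1.0, -2.0):   # pass mark, relative to the higher group's mean
    pass_high = 1 - norm.cdf(cutoff)
    pass_low = 1 - norm.cdf(cutoff + gap_in_sd)
    print(f"cut-off {cutoff:+.0f} SD: {pass_high:.0%} vs {pass_low:.0%} pass, "
          f"gap = {100 * (pass_high - pass_low):.0f} points")

# cut-off +0 SD: 50% vs 16% pass, gap = 34 points
# cut-off -1 SD: 84% vs 50% pass, gap = 34 points
# cut-off -2 SD: 98% vs 84% pass, gap = 14 points

The underlying difference of one standard deviation never changes; only the headline gap does.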
http://www.lagriffedulion.f2s.com/gap.htm
I think it is time to attempt a summing up. As far as I can see, the whole of the UK press, in company with Prof Burgess some of the time, has misunderstood what was being measured and has drawn conclusions which are unsubstantiated, and very probably wrong. It is alarming to conclude this, so I welcome anyone who can point out my mistakes and misunderstandings. (I generally ask authors for comments anyway, and post up their replies without any further comment). I think they all got it wrong, utterly wrong, but I may be mistaken.
The proper and fully validated conclusion should be:
“Progress” measures do not equate to scholastic achievement, so this paper does not inform you about final achievements at age 16. If you want to find out about those, read government statistics, though those can be confusing. Furthermore, these data do not allow you to make assumptions about the amount of effort students are putting into their work (which was not measured), and whether immigrants are desperate to get ahead. They may be, but this paper cannot confirm that. The findings could well be an artefact of the progress measures used, because a low starting point leads to more progress, even if the end result is average.
I don’t do policy, but here is some advice for Head teachers and education authorities.
Schools will get good achievement results if they can get bright pupils. About 65% of the variance in scholastic attainment is due to prior intelligence. If Head teachers and education authorities want to be totally cynical, here is some advice: if you are allowed to give children intelligence tests, use those to select your pupils. Failing that, if you can look at their prior achievements, use those. If you are denied the right to choose on that basis, find children with well-educated parents, even if they are very poor. Try to pack your school with such children, whatever their race. The educational level of parents is an intelligence surrogate measure, and a better predictor than wealth. If you are not allowed to select on parental intelligence, pack in as many Chinese children as possible. Then select Indians/Asians of professional rank, and all Irish, and Whites. Avoid other groups, particularly Roma. Your school will look good in terms of final results. Be highly selective in your “diversity”. If inspectors come to call, show Chinese and Indian students staring down a microscope.
On the other hand, if you want to be even more cynical, and want to be judged not on the final achievements of your students, but on a measure of their progress, make sure you find children with a low starting point in primary school. Find any child whose parents are poor (because “adjusting” for poverty boosts their scores regardless of the cause of poverty). Pack the school with recent immigrants who cannot speak English and who have not adjusted to life in England. Their low scores will make you look good, because with every passing day they will watch TV, speak to English kids, walk the streets, read billboards and newspapers, and listen to the radio. As they acculturate it is likely they will do better at school, if not in absolute terms, then in the more elastic relative terms. For really dramatic results, try to avoid Chinese children. They are bright to begin with, and on your dodgy progress measures they won’t show much progress.
In conclusion, it is a great pity Prof Burgess’s paper did not contain any psychometrics, which would have fleshed out his argument. It is also a pity that he has allowed himself (some of the time) and his listeners (virtually all of the time) to conflate progress with achievement. He has sought to make a particular point: if student progress (allowing for previous achievement) is the criterion then the London effect is spurious, and probably due to immigrants doing better in secondary school than primary school. One cannot attribute to the quality of schooling results which are probably due to immigrants showing progress from a low level in primary schools to a higher level in secondary schools as they get used to the local white culture.
The progress-in-the-light-of-former-achievements measure should a) mention the proportions of recent and more established immigrants, and b) be discussed in the more important context of the end results: GCSE results by ethnic group. What has happened is that Prof Burgess has drifted from making his first point into making an unvalidated and incorrect second point: that immigrants boost school performance. The official statistics show that it depends on which immigrants. Chinese students will do wonders for attainments, Black Caribbeans far less so, Roma not at all.
Economics has been called the dismal science, but the reception given to this paper reveals the dismal level of science reporting in the United Kingdom, certainly as regards psychology.
Please reassure me I am not the only person in the world who detects fatal errors in the conclusions drawn so enthusiastically from this paper by so many journalists.
Comments please.
Deary, I. J., Strand, S., Smith, P., & Fernandes, C. (2007). Intelligence and educational achievement. Intelligence, 35(1), 13-21.