Friday 14 November 2014

Immigrants, scholastic ability, and journalistic ability


The press have given considerable attention to a paper by Simon Burgess, Professor of Economics at Bristol: “Understanding the success of London’s schools”, CMPO Working Paper Series No. 14/333, University of Bristol, October 2014.

The BBC splashed it as: Diversity 'key to London GCSE success'


The Guardian as: London’s GCSE success due to ethnic diversity in capital’s schools

The Daily Mail: Ethnic diversity 'boosts GCSE results': Cities with large numbers of children from immigrant backgrounds do better because they work harder. Schools in London and Birmingham have good results due to minority pupils. White British students make slower progress as they are less ambitious. Bristol University found that ethnic minorities have greater expectations

The Times of London used it in an editorial (12 November 2014) “High Class Immigrants: Research shows that migrants do well at school and help the locals too”. They admit the research has yet to be peer reviewed, so I am stepping into the breach, with nothing but the public interest at heart, as always.

As you will see, the Press have tended to suggest that success is due to diversity, and that immigrants are improving outcomes, rather than that schools are improving them. Let us have a look at the actual paper.

Prof Burgess argues:

We showed some time ago that ethnic minority pupils make better progress through school than white British pupils (see Wilson et al (2005, 2011) and Burgess et al (2009)). Given that these pupils typically live in more disadvantaged neighbourhoods and come from poorer families, their advantages must be less material than books, educational visits and computers. It is argued that ethnic minority pupils have greater ambition, aspiration, and work harder in school. This is the main argument here – London has more of these pupils and so has a higher average GCSE score than the rest of the country.

..there is a London premium in pupil progress of 9.8% of a standard deviation. I show that ethnic composition matters a great deal: in fact, differences in composition account for all of the gap. If London had the same ethnic composition as the rest of England, there would be no ‘London Effect’. Furthermore, there is no significant difference between the progress of white British pupils in London and in the rest of the country. Looking at conditional pupil progress, a London premium of 11% is also entirely eliminated by controls for ethnicity; this is also robust to conditioning on pupil and neighbourhood characteristics. Nor is this a new phenomenon: the London progress premium has existed for the last decade and is entirely accounted for by ethnic composition in each year.

Comment: “9.8% of a standard deviation” is hard to understand. In an intelligence test it would be equivalent to 1.5 IQ points. In terms of overall mean GCSE scores I calculate, from other data, that it would be 1.8 points out of an average of roughly 42.3 points.
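For readers who want the arithmetic spelled out, the conversion is trivial. The sketch below assumes an IQ standard deviation of 15 and a capped-8 GCSE standard deviation of roughly 18.4 points; both are my back-of-envelope figures, not the paper's:

```python
# Convert "9.8% of a standard deviation" into more familiar units.
# Assumed values (mine, not the paper's): IQ SD = 15;
# capped-8 GCSE points SD roughly 18.4, around a mean of roughly 42.3.
effect_sd = 0.098

iq_points = effect_sd * 15        # about 1.5 IQ points
gcse_points = effect_sd * 18.4    # about 1.8 GCSE points out of ~42.3
```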

There is nothing inherently different in the educational performance of pupils from different ethnic backgrounds, but the children of relatively recent immigrants typically have greater hopes and expectations of education, and are, on average, consequently likely to be more engaged with their school work. These results help to explain the ‘London Effect’; they do not explain it away. My argument is that the London effect is a very positive thing, but much of the praise for this should be allocated to the pupils and parents of London for creating a successful multi-ethnic school system. By the same token, there is less evidence that education policies and practices had a large part to play in terms of innovative policies.

The claim that “There is nothing inherently different in the educational performance of pupils from different ethnic backgrounds” does not accord with most research on scholastic ability, but that is what makes the paper more intriguing. It appears to be asserting something which is contradicted by the actual UK results as published by the relevant government statisticians, and does not accord with international data.

To get down to the detail, Burgess has used the standard “Best 8” procedure: the scores on the best 8 GCSE exams from the National Pupil Database 2012/13 are used to assess scholastic attainment. GCSE results are given in grades, scored from 8 for an A* and 7 for an A down to 1, a system which loses all the fine detail of the actual percentage marks and potentially penalises brighter students who take many examinations (Burgess says he followed this procedure because it does not “over-reward” such students, though it blunts the achievements of the bright and diligent), all without really controlling for course difficulty. Deary et al. (2007) give the full results as well as the best 8, and show the detailed results for each major exam, which is very instructive. That paper shows, among other things, that the individual sciences are taken by very few pupils. Sadly, the Deary et al. publication also has to use the crude grading system. What is the point of examiners marking papers in detail and then the system trashing the results by reducing them to grades?

But the main results of Deary et al. are salutary for any researcher seeking to talk about scholastic achievement. They did a 5-year prospective longitudinal study of 70,000+ English children looking at the association between psychometric intelligence at age 11 years and educational achievement in national examinations in 25 academic subjects at age 16. The correlation between a latent intelligence trait (Spearman's g from CAT2E) and a latent trait of educational achievement (GCSE scores) was 0.81. General intelligence contributed to success on all 25 subjects. Variance accounted for ranged from 58.6% in Mathematics and 48% in English down to 18.1% in Art and Design. Girls showed no advantage in g, but performed significantly better on all subjects except Physics. This was not due to their better verbal ability. At age 16, obtaining five or more GCSEs at grades A–C is an important criterion: 61% of girls and 50% of boys achieved this. For those at the mean level of g at age 11, 58% achieved this; a standard deviation increase or decrease in g altered the values to 91% and 16%, respectively.

Simply stated, if you want to talk about the causes of GCSE results at 16 you ought to quote this paper and you ought to distinguish between psychometric intelligence at 11 and educational attainment thereafter.
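Deary et al.'s pass-rate figures can be roughly reconstructed by treating g and achievement as bivariate normal with r = 0.81 (an idealising assumption of mine; the real data need not be exactly normal). Fixing the pass cut-off so that 58% of pupils at mean g pass, then shifting g one standard deviation either way, lands in the same ballpark as their reported 91% and 16%:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Deary et al. (2007): latent g and latent GCSE achievement correlate at
# r = 0.81, and 58% of pupils at the mean level of g achieve 5+ A*-C grades.
r = 0.81
resid_sd = sqrt(1.0 - r * r)  # SD of achievement around its regression on g

# Find the achievement cut-off t such that P(pass | g = 0) = 0.58, by bisection.
lo, hi = -5.0, 5.0
for _ in range(60):
    t = (lo + hi) / 2.0
    if 1.0 - phi(t / resid_sd) > 0.58:
        lo = t   # pass rate too high: the cut-off must rise
    else:
        hi = t   # pass rate too low: the cut-off must fall

# A 1 SD shift in g moves expected achievement by r standard deviations.
p_high = 1.0 - phi((t - r) / resid_sd)   # about 0.94 (reported: 91%)
p_low = 1.0 - phi((t + r) / resid_sd)    # about 0.12 (reported: 16%)
```

The small discrepancy (roughly 94% and 12% here) is what one would expect from forcing real examination data into an idealised normal model; the point is that a single correlation plus a cut-off generates the whole pattern.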

Burgess then goes on to explain: The best way to isolate the contribution of schools, and by extension a city-wide school system, is to analyse pupil progress: to see how well pupils do at GCSE taking account of their prior test scores before entering secondary schools. This necessarily focusses attention on secondary schools (see Greaves et al 2014 for a discussion of primary schools). The prior test scores are each pupil’s performance in the Key Stage 2 tests at age 11, in English, Maths and Science. I define pupil progress as the residual of a regression of GCSE capped 8 points score conditional on these KS2 test scores. (My emphasis.)

So, when Burgess talks about “progress” it is not progress as we might usually understand it, in the sense that students are judged by how far they have progressed from ignorance to knowledge (achievement), but how far they have progressed given their earlier achievements. A child who has done badly at primary school but has then improved at secondary school will be judged to have made greater “pupil progress” than a much higher-performing child who remains at the top of the class throughout their schooling. Burgess does a further version, “conditional pupil progress” which includes a correction for poverty, as measured by being eligible for free school meals, but the main problem still stands. (Progress estimates are also compounded by the questionable assumptions of the “poverty correction”, but we have covered that many times before).

The paper is not about the final achievement of the pupils, but about their progress when allowance is made for their prior ability (as demonstrated by early assessments at age 11). We seem to have confusion between “progress” (achieves a high standard) and “progress” (improves when earlier ability is taken into account). The former is of benefit to society (school leavers able to work or do further study), the latter is one way of estimating whether secondary schools add value, all things considered. Measuring cognitive ability on a “school far” test would be a better way of getting a baseline for later estimating added value in scholastic achievement, but unlike the Deary paper, no cognitive estimates are shown in this paper.

You might, at this point, wish to stop reading.

Burgess is trying to calculate the value that secondary schools add to children’s scholastic achievements. This is a valid exercise, particularly if those schools are talking up their achievements, and boasting without proofs. However, “pupil progress” does not entirely achieve that aim. In fact, the schools may be adding lots of value, but getting results which are in line with predictions based on ability.

To explain this, in words short enough to be understood by science journalists and the leader writers of The Times (at one time a paper of record), consider the following. A bright child, more scholastically able than 80% of their class finishes primary school with good marks. The child doesn’t know very much, but they are learning with each year of education. They go on to secondary school. At age 16 the child is still more scholastically able than 80% of their class but they have learned a lot more. Using this type of pupil progress as a measure will make it seem as if they have not made any progress. In the jargon, there are no residuals from the regression line. They are doing no better than expected, though they are showing more knowledge and more developed skills in actual fact. The progress measure does not show us what level they have progressed to, but only the distance they have travelled according to various assumptions, including assumptions about poverty.

Of course, only a fool would think that children hadn’t learned anything because they had progressed up the system at their usual speed.

Now consider a child who does less well at primary school, perhaps because they are slow to mature, or a recent immigrant. At 11 they are in the bottom 20% of the class. At secondary school they mature, or in the case of immigrants, learn English. Now they do better, and rise to the 30th percentile of the class, a massive 50% improvement. Bingo, they have made a brilliant contribution to the progress score for the school. If the school is being judged on this sort of “progress” it would be smart to find lots of young immigrants who have much to learn.
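The whole argument can be made concrete with a toy regression. In this sketch (my illustration with invented scores, not Burgess's code), "progress" is the residual of GCSE score regressed on KS2 score, so the consistently strong pupils register no progress at all while the late bloomer scores handsomely:

```python
# A toy version (my illustration, not Burgess's code) of "pupil progress"
# defined as the residual of a regression of GCSE score on KS2 score.
# Four pupils with invented scores: two strong throughout, one weak
# throughout, and one late bloomer who improves after age 11.
ks2 = [80, 80, 20, 20]    # age-11 scores
gcse = [80, 80, 20, 30]   # age-16 scores: only the last pupil moves up

n = len(ks2)
mean_x = sum(ks2) / n
mean_y = sum(gcse) / n

# Ordinary least squares slope and intercept for GCSE on KS2.
beta = sum((x - mean_x) * (y - mean_y) for x, y in zip(ks2, gcse)) \
     / sum((x - mean_x) ** 2 for x in ks2)
alpha = mean_y - beta * mean_x

# "Progress" = actual GCSE minus the GCSE predicted from KS2.
progress = [y - (alpha + beta * x) for x, y in zip(ks2, gcse)]
# progress comes out as approximately [0, 0, -5, +5]
```

The two pupils who stayed at the top have residuals of zero despite having learned a great deal; the late bloomer gets +5 and, necessarily, some classmate gets -5, because regression residuals sum to zero. "Progress" in this sense is a relative ranking exercise, not a measure of learning.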

So, if cities like Birmingham and London have lots of recent immigrants, those groups will do poorly at primary school but, as they become acculturated to England, may subsequently do better. The White British locals will already be acculturated, so there is no progress for them to make on that front. This supposition is confirmed in figure F2 which shows that the higher the non-white population the higher the “progress” score.

Burgess goes on to explain that Birmingham shows a greater “London effect” than London. Of course, intelligence research usually shows that brighter people move to cities, but I think that this particular finding is an artefact of the progress measures being used in this paper. Having lots of recent immigrants will increase the likelihood of apparent progress at later ages.

If you look at the “large city versus rest of England” contrasts in the paper, Birmingham is far ahead, London slightly less so, Manchester a bit behind the national average, and Liverpool very close to it. It may be due to the proportion of Asian pupils (look at Table T5).

Burgess also explains that those immigrants who did not complete the 11 year old assessments (perhaps because they arrived as teenagers) were dropped from the analysis, so we cannot judge the progress of this minority. They tended to have lower GCSE scores, possibly because they were late to acculturate.

In fact, a real measure of progress would be to find a test or broad range of tests which could be given throughout early life and into adulthood. Perhaps a wide-ranging general knowledge and skills evaluation (roughly like those carried out by the OECD) would show how well a pupil had been prepared for earning a living.

You will note that the OECD is surprised and concerned to find that at the end of formal education large numbers of people cannot do very much in the economy.

There is something missing from the Burgess paper, which is to answer the question: How good are pupils’ scores at the end of secondary education? It is very hard to find the answer to this question in the paper. I think, but cannot be sure, that the answer is given in Appendix 3 on page 33, which may be further than most science journalists are willing to read, assuming they have read the paper at all.

[Table omitted: standardised capped-8 GCSE scores by ethnic group, from Appendix 3 of the paper]

Burgess has chosen to show all the GCSE totals in standardised scores, which makes them harder to interpret. Plain statistics are always preferable, and the actual scores would allow immediate comparison with other publications, whereas standard scores obscure those key benchmarks. The standardised scores also obscure the pass rates which, as we will see later, are a major cause of distortions in reporting school progress and racial gaps in achievement.

It seems that the highest achieving students are Pakistani, then Black Caribbean and White British and Bangladeshi, and the lowest performing are the Chinese. Obviously, I have made a mistake in reading this table, so I turned from the paper to the latest Government statistics for the relevant year, 2013.

One of the headlines is: Chinese pupils remain the highest attaining ethnic group. The percentage of Chinese pupils achieving 5 or more GCSEs at grade A* to C or equivalent including English and mathematics is 17.5 percentage points above the national average.

This is in line with everything we know about the intellectual and scholastic ability of the Chinese. I have apparently read the table back to front. However, if Appendix 3 is correct, then in the UK in 2013 75% of students get a pass mark, and have an equivalent IQ of 110. Time to go to other sources of final GCSE statistics.

Here is the data on scholastic attainment in 2011. It gives the results in 2007 and 2011. Without intending to, it also shows how the gap between ethnic groups can be manipulated by making exams easier, which can also be done by giving the best results without requiring that they include English and Maths.

To save time, here are the scores in that posting:

[Chart omitted: GCSE pass rates by ethnic group, 2007 and 2011]

To see how the more recent 2011 test results look with an easy 58% pass rate, concentrate on the higher maroon histograms on the right of each pair. The Chinese and Indians are ahead, the rest gradually falling near or behind the White British level, with Black Caribbeans last except for the small numbers of Roma. Now look at the earlier, harder 2007 test results, with a 45% pass rate, marked in light purplish blue. Notice how the Chinese and Indians are still ahead but the other groups are in more difficulty. The scores are proportions passing, not the actual marks, which would show the Chinese even further ahead. Good trick, isn’t it?

La Griffe du Lion explained how this was done in 2004, and educators have not been shamed into dropping it. If you make the pass rate a little higher every year by making the test easier, for several years you will get an apparent closing of the gap, without any fundamental change in the scores. This is because the apparent percentage gap is a function of the two bell curves and the pass mark which is being used as a cut-off.


[Figure omitted: two overlapping normal distributions, one mean above the other, with a pass-mark cut-off]

Consider two populations, the one shown above being better at scholastic achievement than the one shown below, such that 50% of the top group can pass an exam and only 16% of the bottom group pass that same exam. There is a mean difference in scholastic attainment, shown by comparing the distance between the two means. The newspaper headline figure, for those who are not used to looking at normal distributions, is that there is a 50% - 16% = 34 point gap. That makes a good headline, even though it depends on a particular pass rate, and ignores the best measure, which is the mean difference shown in the figure above.

At that point, if you are an educationalist with a political position, announce you are going to transform the educational system (as Bush did in Texas in the 90s). Now, without changing the schools or the teachers, change the pass rate slightly, either by making the exam slightly easier or just passing more children with lower marks, or a bit of both. Keep doing that every year and the apparent percentage gap will eventually come down to about 14 points: only 2% of the top group now fail against 16% of the bottom group. Of course, the actual ability levels have not changed, and the areas under the normal curves have not changed, but by moving the cut-off point and using the misleading point-gap statistic you can probably fool most journalists. (Once you have got to the end of the curve the trick runs out of steam so, flushed with success, you move to another school district at a higher salary and repeat the trick.)
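La Griffe du Lion's trick is easy to reproduce with two normal curves. In this sketch the groups are one standard deviation apart in ability and nothing about them ever changes; only the cut-off moves:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def pass_rate(mean, cutoff):
    """Share of a N(mean, 1) population scoring above the cut-off."""
    return 1.0 - phi(cutoff - mean)

# Two groups whose mean ability is one SD apart; the groups never change.
mean_top, mean_bottom = 0.0, -1.0

# Hard exam: cut-off set at the top group's mean.
gap_hard = pass_rate(mean_top, 0.0) - pass_rate(mean_bottom, 0.0)
# about 0.50 - 0.16: a 34-point headline gap

# After years of quiet easing, the cut-off has drifted two SDs lower.
gap_easy = pass_rate(mean_top, -2.0) - pass_rate(mean_bottom, -2.0)
# about 0.98 - 0.84: a 14-point headline gap
```

The mean difference is one standard deviation throughout; only the headline "point gap" shrinks, from about 34 points to about 14, which is the whole of the apparent "closing of the gap".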

I think it is time to attempt a summing up. As far as I can see, the whole of the UK press, in company with Prof Burgess some of the time, has misunderstood what was being measured and has drawn conclusions which are unsubstantiated, and very probably wrong. It is alarming to conclude this, so I welcome anyone who can point out my mistakes and misunderstandings. (I generally ask authors for comments anyway, and post up their replies without any further comment). I think they all got it wrong, utterly wrong, but I may be mistaken.

The proper and fully validated conclusion should be:

“Progress” measures do not equate to scholastic achievement, so this paper does not inform you about final achievements at age 16. If you want to find out about those, read government statistics, though those can be confusing. Furthermore, these data do not allow you to make assumptions about the amount of effort students are putting into their work (which was not measured), and whether immigrants are desperate to get ahead. They may be, but this paper cannot confirm that. The findings could well be an artefact of the progress measures used, because a low starting point leads to more progress, even if the end result is average.

I don’t do policy, but here is some advice for Head teachers and education authorities.

Schools will get good achievement results if they can get bright pupils. About 65% of the variance in scholastic attainment is due to prior intelligence. If Head teachers and education authorities want to be totally cynical, here is some advice: If you are allowed to give children intelligence tests, use those to select your pupils. Failing that, if you can look at their prior achievements, use those. If you are denied the right to choose on that basis, find children with well-educated parents, even if they are very poor. Try to pack your school with such children, whatever their race. The educational level of parents is a surrogate measure of intelligence, and a better predictor than wealth. If you are not allowed to select on parental education, pack in as many Chinese children as possible. Then select Indians/Asians of professional rank, and all Irish, and Whites. Avoid other groups, particularly Roma. Your school will look good in terms of final results. Be highly selective in your “diversity”. If inspectors come to call, show Chinese and Indian students staring down a microscope.

On the other hand, if you want to be even more cynical, and want to be judged not on the final achievements of your students, but on a measure of their progress, make sure you find children with a low starting point in primary school. Find any child whose parents are poor (because “adjusting” for poverty boosts their scores regardless of the cause of poverty). Pack the school with recent immigrants who cannot speak English and who have not adjusted to life in England. Their low scores will make you look good, because with every passing day they will watch TV, speak to English kids, walk the streets and read billboards, newspapers and listen to radio. As they acculturate it is likely they will do better at school, if not in absolute terms, then in the more elastic relative terms. For really dramatic results, try to avoid Chinese children. They are bright to begin with, and on your dodgy progress measures they won’t show much progress.

In conclusion, it is a great pity Prof Burgess’s paper did not contain any psychometrics, which would have fleshed out his argument. It is also a pity that he has allowed himself (some of the time) and his listeners (virtually all of the time) to conflate progress with achievement. He has sought to make a particular point: if student progress (allowing for previous achievement) is the criterion then the London effect is spurious, and probably due to immigrants doing better in secondary school than primary school. One cannot attribute to the quality of schooling results which are probably due to immigrants showing progress from a low level in primary schools to a higher level in secondary schools as they get used to the local white culture.

The progress-in-the-light-of-former-achievements measure should a) mention the proportions of recent and more established immigrants, and b) be discussed in the more important context of the end results: GCSE results by ethnic group. What has happened is that Prof Burgess has drifted from making his first point into making an unvalidated and incorrect second point: that immigrants boost school performance. The official statistics show that it depends on which immigrants: Chinese students will do wonders for attainments, Black Caribbeans far less so, Roma not at all.

Economics has been called the dismal science, but the reception given to this paper reveals the dismal level of science reporting in the United Kingdom, certainly as regards psychology.

Please reassure me I am not the only person in the world who detects fatal errors in the conclusions drawn so enthusiastically from this paper by so many journalists.

Comments please.

Deary, I. J., Strand, S., Smith, P., & Fernandes, C. (2007). Intelligence and educational achievement. Intelligence, 35(1), 13–21.


  1. Many thanks James for tackling this well-publicised piece of "research". It's very clear from Professor Burgess's comment "There is nothing inherently different in the educational performance of pupils from different ethnic backgrounds" that this work is motivated by political considerations. He therefore tries to obfuscate by not publishing the raw data, as you have found.

    I have not read this paper and hence can't claim to provide a thorough critique. But surely one point (the main point?) of relevance is given that scholastic tests always have an upper and lower bound of test scores, even a random score generator at both the younger and older ages will show the (randomly generated) average initial low-score "pupils" making much more "progress" than the (randomly generated) average initial high-score "pupils", some of whom can only go backwards relatively speaking. Given this principle, a researcher looking to play political tricks will seek to find a population of pupils (eg ethnic minorities) that is likely to score lowly on tests at a young age, and compare them with another group of pupils (eg the indigenous population, or even better, Chinese youngsters) that will heavily out-score them initially. The subsequent conclusion of the "research" is almost guaranteed I would imagine.

    1. PS: Upper and lower bounds come into play with even greater force with a grading system, which greatly narrows the variation in the data. Is it purely a coincidence that the Professor uses grades rather than percentage marks?

  2. Yes, there are apparent regression to the mean effects, but those are not the main issue here, I think. The assessments done at 11 and at 16 are spread out in terms of time and exam content, so are very probably a fair approximation to achievement. It is simply that the immigrant effect appears to coincide with an acculturation effect which does not boost school achievement of itself. Some ethnic groups boost the overall average, some don't. I have just published work on this, using international data, and will describe it later.

    1. Thanks. I don't know exactly how the research has been done of course, but to take an extreme example, if the grading system were confined to awards of either A or B at both ages, and all ethnic minorities were scored B initially, everyone else an A, then when the children are regraded at an older age, the progress of the former group is certain to be as good or better than the latter. That reflects entirely the nature of the examination system, of course, and has nothing to do with the changing abilities of the different groups of children. Any examination system will have these characteristics.

  3. One correction: Chinese pupils have apparently BOTH above-average "progress" and attainment. They manage this by doing slightly better than average (whites) at younger ages, then a little better again at KS2, and then better again at GCSE (i.e. the Chinese-remainder gap is present to start with but gets bigger over time). This could reflect any number of things: acculturation, the increasing heritability of IQ over development, both, etc.

    Another problem is that the achievement of London schools is not strictly comparable to that of schools elsewhere, because up until recently schools were allowed to enter pupils for GCSE 'equivalents' that were equivalent to GCSEs in name only - in reality they were much easier. Many more non-London schools played this game than London schools.

    So yes - this data is a huge mess and the analyses are frequently farcical. The various measures seem designed to confuse rather than illuminate. Thanks for restoring some order to the chaos.

  4. Dear Andrew, thanks for your observations, particularly those regarding GCSE equivalents, which I sidestepped, feeling the post was getting too long. The better approach would be to try to define a core of key GCSEs; then we would have a better grip on real achievement. By the way, I want to get back to the "streaming" debate, and have a half-written post in progress.

  5. Hi James, re twitter will try to explain what I mean.

    Progress in education measures a child's absolute attainment conditional upon their previous score. In the example you gave of the child being in the top 80% at 11 and again at 16, making "no progress" isn't quite right. In the current system children are assessed at 11 on a series of standardised tests and given a score: most children get a 4, some do worse and get a 3, some do better and get a 5 and some do even better and get a 6. The progress measure comes in by tracking these same children to GCSE and seeing what they got. Given previous historical examples we basically expect children who get a 4 to continue to demonstrate reasonable academic ability and "progress" through the education system to get around 5 C's at GCSE (they would therefore show "expected progress"). Equally we would expect children who get a level 3 at age 11 to struggle more (although still to "progress" and learn new things), and to probably get Cs and Ds at GCSE. And then we'd expect the level 5 children to do very well, and the level 6 children to do really well. The point is that we fully expect previously low attaining children to struggle at school and high attaining children to learn more. Which I think is relatively uncontroversial, as you say - lots of research demonstrates kids who do well at 11 tend to do well at 16.

    The reason expected progress is used here (and in education) rather than pure attainment is to try and isolate as much as possible the impact a particular school (or area) is having on the attainment of its pupils. So we fully expect a school serving pupils who enter with high attainment at age 11 to get better absolute GCSE scores than a school serving children who enter with low attainment at age 11. But that tells me about the pupils in the intake, and I’m interested in the school and its teaching. In particular what we are interested in are schools (or areas) where for some reason children are making much more or much less progress than we would expect. London is one of those areas: poor children in London do way way better than children of similar ethnic and economic backgrounds in other parts of the country, both in terms of absolute attainment and progress. And only 10 years ago London was one of the worst performing areas, so we were seeing significant improvement. Interestingly they also do better if they move into London and worse if they move out. (I really recommend the work of Chris Cook on this.) The question for people like me who are trying to figure out if there’s anything to learn from London is why?

  6. Second comment to follow on

    In this paper Burgess sets out to see if London’s much higher performance is down to policies (London Challenge) or due to the different nature of the children these schools serve. He starts with a measure of attainment (the best 8 GCSE, column one of tables 1 and 2) with no attempt to control for progress. As you can see children in London (and Birmingham) score above average, so their pure scholastic ability is high. However we know that London primary schools are also very good (from an IFS study briefly mentioned in the paper) and London’s performance might just be due to the fact these primary schools are producing bright kids and the secondary schools aren’t doing anything. So in column 2 he controls for prior attainment of pupils and compares just pupils who began secondary school with the same level of attainment at age 11. Here the effect is even larger, and note it won’t penalise bright kids. The analysis isn’t to see which children overall are “making faster progress” relative to all their peers; it’s to see which children, who began secondary school at the same level, are making faster progress. This looks great for London, and it gets better when he adds in all the things like birth month, socio-economic background and gender that are also correlated with school attainment. Here London goes up again, which is where most analysis usually stops and says London is amazing. The value of Burgess’ analysis is he then filters for ethnicity, noting reasonably that London has a very different population from the rest of England and this might have an impact. Sure enough it does, and the London effect basically disappears. After all the fuss made over London it seems that we haven’t solved any problems: London schools do better because, although they are serving a high number of poor children, a lot of these children are drawn from ethnicities such as Chinese and Indian, and, crudely, their ethnicity basically trumps their poverty. Which I again think you wouldn’t find that controversial (although you might disagree with his attributing this to environmental factors).

    There are three interesting points left to make. Burgess notes that when he focuses just on GCSEs and not “equivalents” (which, as Andrew says above, are not equivalent at all) the London effect re-appears. This is actually the table you show in your blog; the reason the Chinese are at the bottom of it is that they have the lowest ratio of taking equivalents to “full” GCSEs. Which is what you’d expect: “full” GCSEs are harder, so you’d expect the highest performing group to take more of them. It also re-appears if you focus only on the % of children getting the very highest grades at GCSE. This was my point on twitter: even controlling for ethnicity, gender, month of birth, economic background and the level children started at, London schools still do a better job of getting more pupils to the highest grades than we would expect. And finally there appears to be a potential peer effect where white British pupils (the lowest performing group) are doing better in London, but their performance cannot be explained by anything London schools are doing, and they seem to do worse in areas with less ethnic integration (which might be why Manchester doesn’t do as well as Birmingham or London: all three cities have quite large percentages of ethnically diverse children, but they are much better integrated in Birmingham and London). Although as Burgess says, this might actually be because the white British people in London are inherently different to the white British people in the rest of the country. So maybe there is something good going on in the capital (or maybe it’s just attracted all the smart people).

    Sorry for such a long post – I hope it’s useful and reasonably clear. Please just let me know if not, or if you feel I’ve made an error anywhere!

  7. Thanks for your very detailed explanation of your argument, which I appreciate. However, I probably haven't explained sufficiently my strong reservations about "controlling for" variables which are themselves contaminated by differences in ability. I will dig out those posts for you, and then come back later.

  8. The presentation of this research in the media implied that London children were actually getting higher GCSE grades than the benighted white-bread provincials (I paraphrase).

    Is this not the case?

  9. Study UK results before London. Look at population before sub-population. In UK some ethnic groups were getting a higher pass rate than local whites, and likely higher grades. This was already known and predicted from higher intelligence of Chinese. Coverage said "diversity" was good, which obscures the differences in racial groups, some raising and others lowering educational attainment. For London effect, look at Birmingham effect, which is stronger, but conflates the two different meanings of "progress".