Last week, we learned that the estimate of net migration for international (non-EU) students, of almost 100,000, was not a robust statistic.
New exit check data put the number of student visa overstayers at something like 4,600. While the higher estimate included people who had other legal rights to remain, as well as those who did not leave when their visas expired, these figures led the government to commission a review to gather more information.
Bad numbers lead to bad policy, and the migration figures have fed into a variety of debates including that about the biggest policy issue of our time: Brexit.
A key lesson from this situation is that it is important to acknowledge when policy relies on imperfect statistical sources. Politicians, policymakers and statisticians need to work together to address poor quality data. An interesting contrasting case study is how we measure the economy. Economic statisticians are grappling with the question of how to measure the value of services that are provided for free in our new digital economy. You may watch videos for free on YouTube, use a free Gmail account for email, and scroll a Facebook feed at no charge, yet in each case some economic value has been created. Statisticians are struggling with how to measure it. And it has big implications: if we can't capture prices and inflation properly, it becomes harder to set interest rates.
The difference in this area is that the UK has been proactive in grappling with these questions. A review of economic statistics was commissioned by the Treasury, and Sir Charlie Bean's report set out the wide range of issues that needed attention. More money was found for the Office for National Statistics to innovate in this area, which has led to a variety of initiatives, including the ONS economic statistics centre of excellence, a data science campus, and an ONS / Royal Statistical Society / Royal Economic Society working group on economic statistics.
Some of the reaction to the migration statistics fiasco has blamed the ONS. But this is unfair. Statisticians have long pointed out that the International Passenger Survey, which produced the 100,000 figure, was not a good measure of migration and was originally put in place to measure tourism. Sir Andrew Dilnot, then chair of the UK Statistics Authority, argued in a letter to the Public Administration Select Committee (PASC) in 2013 that there were two options with regard to migration statistics: either the provision of government funding for further work, including the development of new or additional sources, or acceptance that the UK might not be able adequately to produce the migration statistics that government might want.
There was little excuse for dodging this issue. It has been clear for several years that migration was a key topic of public interest. PASC was scathing in its report on migration statistics, with its chair Bernard Jenkin arguing: ‘Some would say that successive governments have hardly been trying to fix this – they didn't want people to know the truth… Given the importance of immigration as a potentially explosive issue, this ought to be given a much higher priority.’
It has taken a long time for policymakers to respond to calls for better data, given the pressing nature of the public debate on migration. The exit check data is still experimental. The Home Office should not be surprised that commentators are taking a dim view of its behaviour, and that some argue ideology overrode evidence in its policymaking, given the way it presented the International Passenger Survey data as more robust than it was.
So the lesson is that government should be anticipating our need for information on the big public issues of the day, and investing in the statistical information we need well before those issues blow up on the front page of the press.
The good news is that the so-called ‘big data’ world we live in does mean there are many interesting new data sources available to explore. The two most important are digital data streams (e.g. phone data, scanner data) and government administrative data (e.g. benefit or tax records, or indeed the exit checks data used to produce the new figure on overstaying students). The Royal Statistical Society has campaigned, through our Data Manifesto, for greater use of these data sources for statistical purposes. We pushed for greater powers of data access for the ONS in the Digital Economy Act, which it has received. The ONS is therefore now well placed to use these new data streams to cast light on the biggest issues of our day.
The interesting question policymakers and statisticians should be debating is this: in which future areas will the public want quality information to inform national debate, but where our statistics are not yet good enough?