Big data: get it right, and the benefits for policy-making could be huge

"Big data" promises big rewards for government, says Douglas Beal of the Centre for Public Impact. But it must be driven by policy objectives and analysed by staff with the right skills


By Douglas Beal

01 May 2015

There is a room, three levels beneath the East Wing of the White House, that outsiders seldom enter. A world away from the formal reception areas, the space – its official title is the Presidential Emergency Operations Centre – is the bunker to which then vice-president Dick Cheney and senior aides retreated on the morning of September 11, 2001.

With president Bush racing west on Air Force One from Florida to first Louisiana and then Nebraska, the room became the nerve centre for the initial response to the attacks. Orders – such as grounding 4,000 planes in American airspace and evacuating Congressional leaders – came thick and fast.

What citizens and the media didn't know at the time was that many of these decisions were taken without the support of accurate, up-to-date data or bespoke technology. Telephone systems and video conferences – both in the bunker and 45,000 feet up – proved woefully unreliable. It also later transpired that the attacks themselves might have been prevented had the US government been better at aggregating and analysing data: the information existed, but it wasn't being shared.

The US, though, has learned from its mistakes. Fast forward 14 years and the Department of Homeland Security – set up in the aftermath of the attacks as a merger of 22 individual agencies – has overall responsibility for keeping America safe. It is the hub for the timely aggregation, analysis and dissemination of information critical to national security and economic stability.

The US also revamped the operations of the Executive Office of the President (EOP) after the attacks. The EOP is made up of many entities, including the director of national intelligence and the National Security Council, as well as others such as the Office of Management and Budget and the Council of Economic Advisors. Working side by side, these entities now benefit from clearer, more transparent information collection and analysis which, in turn, leads to stronger policy options and insights for the current and future occupants of the Oval Office.

September 11 is an extreme example, of course, but policymakers have little respite from the constant demand to make decisions that potentially affect millions of people. Such decisions are a recurring feature of life in power. Where to spend scarce funds? Which reforms will deliver the strongest public impact? Which new technology can best enhance existing processes? The answers are rarely straightforward.

This ever-shifting terrain demands not only that the best and brightest be attracted into public service – not always simple, given the higher salaries usually on offer in the private sector – but also that government can rely on state-of-the-art systems and data that give policymakers the information they need to guide their decisions. It's not that governments are short of data – far from it. On a daily, even hourly, basis, their programmes and services generate and consume huge amounts of data and other information. The question is how to capture and use it to best effect.

An important starting point is for governments to remember that their objectives are what should drive the design of any new big data system; technology-driven efforts often fail. Instead, policymakers should be guided by the vision of achieving better and faster decisions through data and information analysis, insights and policy support. In practice, this means gathering data and information on a wide range of topics from a diverse range of sources, which is then analysed in a central location by a team of advisers. It is they who should then make policy recommendations to the prime minister, president or ministries. So, how can they get there?

The pace of technological change means that governments can now develop high-level IT architectures that are scalable and flexible while still offering value for money. One-off systems are a thing of the past. There is now a far greater ability to aggregate data from various sources, removing the need to rely on potentially outdated IT systems – and no need to build a supercomputer for these tasks, since the needs and queries will change over time.
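To make this concrete, here is a minimal, hypothetical sketch – not any government's actual system – of what aggregating extracts from separate departmental sources into a single queryable view might look like, using Python and pandas. The department names, fields and figures are invented for illustration.

```python
# Hypothetical sketch: pull extracts from separate departmental systems into
# one combined table that analysts can query, instead of interrogating each
# legacy system separately. All data below is invented for illustration.
import pandas as pd

# In practice these extracts would come from APIs, CSV exports or databases.
health = pd.DataFrame({
    "region": ["North", "South"],
    "clinic_visits": [12450, 9800],
})
transport = pd.DataFrame({
    "region": ["North", "South"],
    "bus_journeys": [310000, 275000],
})

# Join on a shared key to produce a single view for analysis.
combined = health.merge(transport, on="region")
print(combined)
```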

Another priority for departments is to develop the analytics capabilities to make use of the huge amounts of data they produce and receive. Organisations can get results even from unstructured data, using self-learning systems that improve their analysis over time. Unfortunately, many public sector organisations find themselves doing ad hoc analysis of specific issues, reliant on poor-quality data that has not been integrated and may be out of date. The focus now should be to embed analytics into strategic and frontline decision-making, backed by strong executive-level support.
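As a hedged illustration of what "self-learning" analytics on unstructured data can mean in practice, the sketch below uses scikit-learn to train a text classifier incrementally, so that each new batch of labelled documents refines the model rather than requiring a rebuild from scratch. The categories and example texts are invented for illustration.

```python
# Hedged sketch: an incrementally trained text classifier (scikit-learn) that
# improves as new labelled documents arrive. Categories and texts are invented.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectoriser = HashingVectorizer(n_features=2**16)  # turns raw text into features
model = SGDClassifier()                           # supports incremental updates
classes = ["complaint", "enquiry"]

# First batch of unstructured text, labelled by staff.
batch1 = ["The bus was an hour late again", "When does the clinic open?"]
model.partial_fit(vectoriser.transform(batch1), ["complaint", "enquiry"], classes=classes)

# Later batches refine the same model without retraining from scratch.
batch2 = ["Road works have blocked my street", "How do I renew a passport?"]
model.partial_fit(vectoriser.transform(batch2), ["complaint", "enquiry"])

print(model.predict(vectoriser.transform(["My street lights are broken"])))
```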

This cultural change places a huge burden on government recruiters, however, as these skills are difficult to find and to keep developing over time. The UK, for example, will need 250,000 data scientists and analysts over the next five years, yet only an estimated 20% of that number will be available – and a similar picture exists across the western world.

But it's not all bad news. The UK may be short of key staff, but its Government Digital Service (GDS), which implements the government's "digital by default" strategy, is widely seen as a pioneer. It blogs about its progress, openly reports data on service costs and volumes, publicly tracks its performance against targets, and publishes its source code for other governments to use. Talk about transparent.

Such examples show that governments are increasingly grasping the potential offered by big data and analytics – and so they should. After all, they offer a huge opportunity for policymakers to optimise policy, programme and service design. Now that's making public impact.
