By CivilServiceWorld

14 Nov 2012

Vast quantities of information are generated each second, but how can policymakers exploit ‘big data’ to inform their decisions? Ben Willis watched as experts from the worlds of academia, IT and policy tried to make sense of it all


The world is drowning in digital information – and the phenomenon even has a name: ‘big data’. Last year, a report by the McKinsey Global Institute laid bare the scale of the data boom: 30bn pieces of content are shared on Facebook every month; 13 hours of content are uploaded to YouTube every minute; 2.2bn terabytes of new data are created every day; and 90 per cent of the data in the world was created in the two years preceding the report’s publication.

For policymakers, this glut of data represents both opportunities and challenges: opportunities because it can help them to better understand what citizens want, enabling them to shape more effective policies; challenges because there is simply so much data out there that making sense of it is technically complex.

Last month, Civil Service World and IT services firm HP co-hosted a seminar to examine the opportunities and challenges involved in exploiting big data. With presentations and discussion from leading experts from the worlds of IT, policy and academia, the event heard how technology is being developed to make sense of the vast quantities of consumer data now available, and how policymakers can start using this information to inform their decisions.

Making sense of the data explosion
Bill Saumarez, vice president of data analysis firm Autonomy, part of the HP stable, outlined the challenges the big data explosion presents. The problem with the sheer volume and variety of data now available, he said, is that 90 per cent of it is so-called “unstructured” electronic information: qualitative data such as the comments made on social media sites and other internet forums.

Because most computer systems are built to analyse structured, quantitative data such as that found in databases, they cannot make sense of the subtleties that give meaning to the content of unstructured data, particularly that emanating from social media. Until recently, this has made the data all but unusable.

“Social media is a major challenge – and it’s not going away,” said Saumarez. “If I’m a doctor, the word ‘sick’ has a different meaning than if I’m a rapper. The core challenge is how we understand and interpret this data. Attempts to hammer the world flat by normalising the data [for example, using tagging and keywords] do not account for the shades of grey.”

But Saumarez said technological developments are now emerging that allow unstructured information to be analysed. Advances in “meaning-based computing” now enable the real-time, contextual understanding of all data, structured and unstructured, using pattern-matching technologies to recognise concepts and trends. “Only by understanding the meaning of information can businesses glean actionable insight from big data and capitalise on it by acting in real time,” Saumarez said.
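To give a flavour of the idea, the sketch below disambiguates Saumarez’s ‘sick’ example from the words around it. It is deliberately minimal: the context lists are invented for illustration, and commercial meaning-based systems learn such associations statistically from large volumes of text rather than from hand-written lists.

```python
# Illustrative only: a toy contextual disambiguator for the word "sick".
# The context word sets below are invented for this example; real
# meaning-based systems infer such associations statistically.

MEDICAL_CONTEXT = {"doctor", "patient", "symptom", "fever", "clinic"}
SLANG_CONTEXT = {"track", "beat", "tune", "rapper", "album"}

def interpret_sick(text: str) -> str:
    """Guess whether 'sick' means 'unwell' or 'excellent' from co-occurring words."""
    words = set(text.lower().split())
    medical_hits = len(words & MEDICAL_CONTEXT)
    slang_hits = len(words & SLANG_CONTEXT)
    if medical_hits > slang_hits:
        return "unwell"
    if slang_hits > medical_hits:
        return "excellent"
    return "ambiguous"

print(interpret_sick("the doctor said the patient felt sick"))         # unwell
print(interpret_sick("that rapper put a sick beat on the album"))      # excellent
```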

Policy applications
But what value does the exploitation of big data have? According to Mark Perrett, business consultant at HP Enterprise Services, the mining of data – particularly from social media – has great potential to offer insight into what people want. “What you get from social media is emotion,” he said. “Getting hold of that can be incredibly powerful from a customer insight perspective.”
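As a rough illustration of how that emotion might be distilled into a customer-insight figure, the toy snippet below scores a handful of invented posts against an invented sentiment word list and reports the share that are positive. Real social-media analytics rely on far richer, trained models; treat this purely as a sketch of the aggregation step.

```python
# Toy illustration: turning raw social-media posts into a single
# insight figure. Posts and sentiment word lists are invented.

POSITIVE = {"love", "great", "helpful", "easy", "fast"}
NEGATIVE = {"hate", "slow", "confusing", "broken", "angry"}

posts = [
    "love the new online service, so easy to use",
    "the form is confusing and the site is slow",
    "great turnaround, really helpful staff",
]

def score(post: str) -> int:
    """Return +1, -1 or 0 depending on which sentiment words dominate."""
    words = set(post.lower().split())
    balance = len(words & POSITIVE) - len(words & NEGATIVE)
    return (balance > 0) - (balance < 0)

scores = [score(p) for p in posts]
positive_share = scores.count(1) / len(scores)
print(f"{positive_share:.0%} of sampled posts are positive")  # 67%
```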

Meanwhile, Professor Philip Treleaven, director of the Financial Computing Centre at University College London, said there is growing demand among businesses and government agencies to use data for modelling the likely outcome of business or policy decisions. His centre works in conjunction with organisations in the worlds of finance, retail and government, among others, to develop new computer systems for analysing data.

“Banks, retailers and government [are] collecting huge amounts of data, and then doing modelling or running simulations on it, to see what patterns there are in it. So what we’re moving towards are these experimental facilities, like digital wind tunnels, where you’ve got huge amounts of financial data, economic and social data, and then either the banks or government could come in and run simulations to look at what would happen if we drop the VAT rate by 5 per cent, for example. That’s the way things are going,” Treleaven said.
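In highly simplified form, Treleaven’s ‘digital wind tunnel’ can be sketched as a Monte Carlo simulation: apply a policy change, run many randomised scenarios over the data, and study the spread of outcomes. Every parameter below, from the spending base to the demand response, is invented purely for illustration and carries no empirical weight.

```python
# A heavily simplified "digital wind tunnel": Monte Carlo simulation of
# cutting the VAT rate by 5 percentage points. All figures are invented.
import random

SPENDING = 900e9            # assumed annual consumer spending base, in pounds
OLD_RATE, NEW_RATE = 0.20, 0.15

def simulate_revenue(rate: float) -> float:
    """One randomised scenario: spending responds to the rate with noisy elasticity."""
    elasticity = random.gauss(-0.8, 0.2)                      # invented demand response
    demand_shift = 1 + elasticity * (rate - OLD_RATE) / (1 + OLD_RATE)
    return SPENDING * demand_shift * rate

random.seed(42)
runs = 10_000
changes = [simulate_revenue(NEW_RATE) - simulate_revenue(OLD_RATE)
           for _ in range(runs)]
mean_change = sum(changes) / runs
print(f"mean VAT revenue change over {runs:,} runs: £{mean_change/1e9:.1f}bn")
```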

Cultural changes
But making better use of big data in government is about more than simply developing the right technologies. According to Julian McCrae, director of research at the Institute for Government, Whitehall’s “poor” use of data to drive decision-making is reflective of a lack of interest in this way of working among the higher echelons of the civil service.

McCrae cited a recent IfG study looking at Whitehall’s use of management information to improve public spending decisions and drive up value for money. The study found that the use of such tools across government is inadequate; this, McCrae said, is not a symptom of deficient systems or technology but the result of poor leadership.

“The conclusion came back to leadership. This wasn’t an information issue, it wasn’t a systems issue. It was about what leaders were focused on,” McCrae said. “So you had a symptom – poor management information and its use in the public sector – which was [caused by] a set of leaders on the official side in the department not being particularly interested in asking questions about benchmarking. What do we know about what’s driving the value that’s being added in our business? How can we put some numbers on that? How do we know if what we’re doing is good or bad?” McCrae believes that the culture of asking such questions is “really weak inside the top leadership in Whitehall”.

“I can’t prove it, but if you had a quick glance at leadership teams in Whitehall, you’d probably find they’re very similar: people who’ve spent a long time in policymaking,” he continued, arguing that political imperatives frequently outgun the principle of building policy around a strong evidence base. When he worked in Whitehall policymaking, he said, “it seemed to me we did things incredibly quickly if ministers were really interested in them, and we’d cut any corners necessary to make sure we got the right ministerial outcome, when those corners were: how would you run a rigorous, properly evidence-based policy machine?”

But despite the consensus among contributors to the seminar that there is a clear role and need for better use of data in policymaking, Perrett pointed out that big data is only one more tool in the policymaker’s toolbox.

“The way I see big data helping is to give us better insight into what might be happening,” he said. “But at the same time it’s maths, statistics, probability – so if something has a 90 per cent probability of happening, it also has a 10 per cent probability of not happening. That’s where some of the human factors are going to stay with us, and it’s going to be down to gut feeling at the end of the analysis. We can only use data to give us a better guess.”
