Opinion: How best to put people in boxes

No-one likes to be stereotyped, but using employee engagement surveys to categorise and understand employees makes sense, argues Jonny Gifford of the Chartered Institute of Personnel and Development (CIPD).


By Civil Service World

06 Jan 2014

For decades now, the primary method of collecting insight into employees’ working lives has stayed the same: write a questionnaire, send it out, prod people to complete it, collect the results, a bit more prodding, crunch the numbers and pull it together into a story that makes sense (or one that you want to hear). Then, as with all research, you either bury it in a filing cabinet somewhere or actively disseminate the findings, to celebrate successes, and galvanise and focus efforts to make improvements.

The only real change in self-completion surveys is that they have gone online, making them cheaper, easier and in some ways more reliable (randomisation of questions can help avoid bias). But it still amounts to working your way through a raft of narrowly defined questions, selecting each time from a predefined bunch of options. Literally, if not figuratively, a tick-box exercise.

Is this a tried-and-tested method or one that’s due for replacement? There are at last some significant methodological developments taking place, but I’d argue that many of the frustrations with self-completion surveys are based on misunderstanding, poorly designed tools, or incomplete reporting.

A common misunderstanding is that the decision to use qualitative or quantitative research methods is little more than a question of how you like to see your data – your preference for numbers or for richer descriptions. What’s easily missed is that qualitative and quantitative methods do different but complementary jobs: typically the former is used for theory building and the latter for theory testing. In tandem, the two form a cycle in social research.

Take qualitative research first – the likes of in-depth interviews, focus groups, written comments and even observation. You need to be systematic in your analysis – it's not completely unstructured – but this sort of data informs thinking by giving a fuller understanding of the ins and outs, the reasons why, the experiences, the clarity, the confusion, the different world views.

The strength of this is that you can be relatively sure about the validity of your interpretations – the causality, the reading of the data. The weakness is that, if you don't get data from a representative sample of the population you're interested in, you don't know how widely your findings hold true.

This is where quantitative research comes in. Its strength is that it is more precise, more structured and, done well, representative of the population(s) of interest – all civil service employees, for example, or staff in certain departments, in certain functions, at certain grades or from certain ethnic groups. If you are interested in any such groups, you will at some point want to measure and count them, to see how big the challenge (or opportunity) is, what you should prioritise and where you should focus your energies.

If your starting point is robust qualitative research and you carry those insights into the survey design, then no matter that you’re giving employees boxes to tick, the boxes you present will serve you well.

Anyone can knock up a questionnaire and, judging by some efforts, they frequently do. It is easy to think you know what the important questions are and how they might be answered, but if a survey tool constrains people artificially, it gives false precision and a picture that is unreliable or skewed in ways unknown. It is also easy to disregard the craft needed to design a good questionnaire – to write split questions or leading questions, or to give ambiguous response options.

This is where frustration originates: in poorly designed surveys. That frustration carries through into suspicion about the results, making it all the more important that survey findings are explained properly. But good survey research still has a valuable role to play.

Nonetheless, there are two big advances ready to be embraced. Firstly, the advent of social media in organisations means that, rather than watch their views disappear into the ether and wait for a response, employees can interact with each other at the same time as feeding their views upwards. They can see how their views land with colleagues and read about other people’s experiences and ideas. Communication is not two-way but multi-directional.

This becomes a particularly valuable source of data when it is teamed with another development: our growing ability to analyse unstructured data. Whether through software that mines written comments or through online functionality that allows staff to rate each other’s comments, we are better placed to take the crucial step of aggregating qualitative data.

Our latest research on social technology – conducted, yes, through a survey – suggests that these new methods have yet to take hold in UK workplaces.

But even where they are used, they do not negate the role of survey research. You may get great insight from bottom-up employee ideas and from seeing how other employees react to them, but this won’t tell you what the ‘lurkers’ think – the staff whose voice is absent. We will always need representative, precise and reliable data. Not only can we put people into boxes, it’s a necessity. Just not all the time.
