This is the final post in a series of three based on a talk I gave at the Institute for Government at their Crowdsourcing Policy event, 23rd January 2012.
The second post explored four ways that government could crowdsource policy. I’m going to take that as a jumping-off point to highlight a number of principles which government should take into account as it considers whether or not to open policy to the crowd. I didn’t have time to highlight these issues during my talk, so this post is an extension of my presentation.
I’ll continue to use the jelly bean analogy I began in the second post.
My first use for crowdsourcing was tapping into the wisdom of the crowd, which I illustrated by inviting the 90 people at the event to guess how many beans there were in the bag pictured at the top of the previous post. Intuitively one might expect that the larger the crowd, the more accurate the answer. Opening up my call for guesses to my Twitter followers would be one way to increase the size of my crowd. However, because my followers can’t see the size of a bean in relation to the size of the bag, their information is less reliable, and my larger sample will find it harder to give the right answer.
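The intuition can be sketched with a toy simulation (all the numbers here are made up for illustration): independent errors average away as the crowd grows, but a shared blind spot, such as not being able to judge the bean-to-bag ratio, does not, no matter how large the crowd gets.

```python
import random

random.seed(42)
TRUE_COUNT = 500  # hypothetical number of beans in the bag

def crowd_estimate(n, bias=0.0, noise=0.4):
    """Average of n guesses, where each guess is the true count
    distorted by a shared bias plus individual random noise."""
    guesses = [TRUE_COUNT * (1 + bias + random.gauss(0, noise))
               for _ in range(n)]
    return sum(guesses) / len(guesses)

# A room of 90 people who can see the bag: individual errors are
# independent, so they largely cancel out in the average.
room = crowd_estimate(90)

# Thousands of followers who share the same blind spot (say they all
# underestimate by 30% because they can't judge the bean-to-bag scale):
# the shared bias never averages away, however big the crowd.
followers = crowd_estimate(5000, bias=-0.3)
```

A bigger crowd only helps when the extra guesses bring independent information; adding thousands of systematically misled guessers just converges more precisely on the wrong answer.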
This is a trivial example, well known to many people. However, I don’t think the lessons it offers have been fully learnt by government. For example, the government’s Spending Challenge invited first civil servants and then the public to identify wasted public spending. I would contend that the vast majority of the public don’t interact enough with public services to be able to identify waste effectively. The sheer weight of comments which didn’t identify waste in the sense the government intended confirms this.
This leads me to my first principle.
Principle 1: Have a clear understanding of which crowd it is you need to engage with.
It is worth noting that Jeff Howe’s definition of crowdsourcing rests on an open call. I just think that when considering crowdsourcing policy his definition needs adapting, because in policy terms, not all crowds are equal.
My second use for crowdsourcing was for collecting information that is highly distributed within a population. I used the example of the flood prediction map for the Philippine capital, Manila, which used crowdsourcing to identify flood water heights after a devastating typhoon in 2009.
A flood prediction map can be used for many different things: to warn citizens as waters begin to rise, to develop better flood defences, and to invest in improving infrastructure. And it is in this spending of money, of course, that government has power. The more important the policy area, the more likely it is that there will be stakeholders with competing interests. The more money there is at stake, the worse the problem will be. Any system can be gamed to try to put one group at an advantage over another, whether it be for infrastructure upgrades or investments in education, for example. Where government does decide to embark on this kind of crowdsourcing exercise, it will be important that the platform used allows for error checking and does all it can to spot and prevent gaming.
Principle 2: Ensure that you reach out to excluded groups and that you minimise gaming of the process.
The third use I highlighted for crowdsourcing was to help identify the preferences of the crowd. I illustrated this using my trusty jelly bean and noting that the crowd could be asked to identify their favourite colour. This same kind of process could be used to identify the preferences of a local community for the opening hours for a doctors’ surgery.
However, there’s a methodological challenge, which leads me to my third principle.
Asking the simple question, ‘which colour is your favourite?’, gives you the clear result that red and blue are neck-and-neck in popularity, way above the rest. So you increase the proportion of red and blue beans in the sample and sit back, ready to rake in the cash. But sales plummet. What you didn’t know was that people who like red beans hate blue ones, and vice versa, because you didn’t ask that question.
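The trap can be made concrete with a toy tally (the survey numbers are entirely invented): counting only favourites tells you to stock red and blue, but once you also record what people hate, the same mix turns out to displease most of them.

```python
from collections import Counter

# Hypothetical survey: each respondent names a favourite colour
# and, optionally, a colour they hate.
respondents = (
    [{"favourite": "red", "hates": "blue"}] * 40
    + [{"favourite": "blue", "hates": "red"}] * 40
    + [{"favourite": "green", "hates": None}] * 20
)

# The simple 'favourite colour' question: red 40, blue 40, green 20,
# so the obvious move is to fill the bags with red and blue.
favourites = Counter(r["favourite"] for r in respondents)

# But a red-and-blue bag guarantees that every red lover meets blue
# beans and every blue lover meets red ones: 80% of customers find
# a colour they hate in every bag.
unhappy = sum(1 for r in respondents if r["hates"] in {"red", "blue"})
share_unhappy = unhappy / len(respondents)
```

The data wasn’t wrong; the question was incomplete, which is exactly the framing problem the principle below points at.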
Principle 3: Understand the way you have framed the question and the implications of the questions you are asking.
I think there’s a second methodological challenge relating to this which highlights two more principles.
Continuing my increasingly tenuous jelly bean analogy into the realms of fantasy, it turns out that the people who hated the red beans were parents who think that the dye used turns their kids hyperactive and aggressive. Parents with money are the most important market, and getting rid of the red beans would see sales skyrocket. Who cares about the red lovers? They’ve got no money and rarely buy jelly beans.
In the world of commerce this sort of decision is unremarkable. However, in the policy world it’s different. Take the case of the doctors’ surgery opening hours. Further research discovers that the minority of people who want the surgery open at an awkward hour are the same people who also cost the health service the most money when their long-term conditions get worse if not picked up by the GP.
Principle 4: Be clear about what other data you need in order to understand the information you are collecting.
But this highlights a challenge for government: the most efficient use of public money would be to go against the crowdsourced timings and cater for the minority who cost the public purse most. However, these are unlikely to be the voices which will be loudest if it becomes more difficult to get the yellow fever jab for the beach holiday in that exotic resort.
Crowdsourcing might help you to identify preferences and issues you hadn’t been aware of before, but it can’t be used to develop a political consensus – other techniques must be used, from simple political leadership through to large deliberative assemblies. *
This leaves the politician in a position where the crowdsourcing hasn’t answered the question they thought they set out to answer, but has instead raised a significant political problem.
This political problem will be compounded if the crowdsourcing exercise was framed wrongly, or without sufficient thought. Voting, for example, is a common way to surface people’s preferences.
However, it is reasonable to assume that the people taking part will expect the option with the most votes to win, in this case becoming one of the times the surgery is open. Simple changes, such as a different choice of words, or using ranking rather than voting, would make a real difference to the expectations raised.
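How much the method matters can be seen in a small sketch (the ballots and opening times are hypothetical): a straight vote and a ranked count over the very same preferences can crown different winners, so the choice of mechanism shapes what the crowd appears to want.

```python
from collections import Counter

# Hypothetical ballots ranking three candidate opening times, best first.
ballots = (
    [["8am", "12pm", "6pm"]] * 35   # early risers
    + [["6pm", "12pm", "8am"]] * 33  # after-work voters
    + [["12pm", "8am", "6pm"]] * 32  # midday voters
)

# Plurality voting: count first choices only. 8am 'wins' on 35 of 100
# votes, even though 65% of voters preferred something else.
plurality = Counter(b[0] for b in ballots).most_common(1)[0][0]

# A ranked (Borda) count: 2 points for first place, 1 for second,
# 0 for third. The broadly acceptable 12pm slot comes out on top.
borda = Counter()
for ballot in ballots:
    for points, option in zip((2, 1, 0), ballot):
        borda[option] += points
ranked_winner = borda.most_common(1)[0][0]
```

Neither result is the ‘true’ preference; each is an artefact of the question asked, which is why the framing needs settling before the exercise begins, not after.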
This leads me to my fifth principle.
Principle 5: Don’t raise expectations beyond what you can deliver on.
In practice, what this means is: be very clear with yourself, and then with the crowd, about what other factors you will be taking into account (the cost to the health service of different user groups, for example) and the weight you will give the results of the crowdsourcing in relation to the other evidence. Ensure you only go ahead if you are confident that citizens’ views will have a significant enough impact on the final decision.
My jelly bean crowdsourcing was developed by a marketing whizz-kid, who sold it internally as a way of raising sales. He found out that while parents don’t like red beans, kids really love green beans. He spent millions on a funky marketing campaign with dancing beans and a catchy slogan. He then went back to the factory and asked for three times as many green beans. However, the head of bean manufacture said that for various technical reasons this wasn’t possible. The marketing whizz was left with customers who had bought into the green bean strategy, had had their hopes raised, and now hated the brand that couldn’t deliver.
This increasingly tenuous analogy has a real world counterpart though. The public gave comments in their tens of thousands to the Spending Challenge, but government departments took very few ideas from the public on board. I’d argue that this is because the exercise was framed in such a way that Whitehall departments couldn’t use the results and because they weren’t bought into the process in the first place – the Spending Challenge was something cooked up between the Treasury and Number 10, with little engagement with the rest of Whitehall.
As I pointed out a fortnight ago on this blog, our Pathways through Participation research shows that bad consultation puts people off engaging with government again, reduces trust in government and, worse, undermines wider community activity too.
This leads me to my next two principles, which government should abide by as it engages citizens, whether in a crowdsourcing exercise or in other ways.
Principle 6: Involve those who are critical to implementing the decision
My final, meta-principle is one that I spend a lot of my time saying to government, and it’s a hard message for government to hear in this relentless 24-hour media cycle. It’s this: SLOW DOWN!
Principle 7: Spend time and money up front framing the problem, understanding how and when you are going to take the decision and the realistic impact that the public’s view will have, and getting all of the key players within government bought into the exercise.