Published on October 13, 2015

Digital Public Engagement – Lessons from the Sounding Board


By Reema Patel

Reema is a Policy Analyst at Involve, working on the Citizens and Science programme - including Sciencewise, the expert national resource centre for public dialogue input into science and technology policy.

Last week the Involve team at Sciencewise ran their first pilot of the Sounding Board, an online discussion panel of 20-24 people feeding into a complex science and technology issue. The pilot engaged 23 participants from a broad range of age groups, geographical locations and educational backgrounds. The whole Sounding Board process, including scoping out policy questions, engaging members of the public and reporting on results, will be completed over a period of 6-8 weeks. Our hope is that this streamlined engagement process will make the Sounding Board a useful, responsive and cost-effective tool for policymakers who are seeking early and rapid public input into their policy decisions.

How did we design the Sounding Board platform?


Design of the Sounding Board was an iterative process, through which we tested the platform with colleagues, and with our friends and family. We discovered that nothing moved its development along more than simply putting it in front of other people to pick apart – allowing issues and problems to be identified early, so they could be resolved down the road. One of the most important lessons from the process for innovators within public policy was the importance of creating a safe space that encourages innovation, and of accepting the possibility of failure as a stepping stone to success.

How did we run deliberative sessions online?

All the participants had the opportunity to familiarise themselves with the online platform before our three deliberative sessions. This gave the team the opportunity to identify and iron out technical issues, as well as to offer tailored support. We gave participants information about the policy issues in advance of the sessions, and they also had a chance to ask policymakers clarifying questions. We split our 23 participants into three cohorts (of 6-10 participants, plus a policy maker and a scientist) so that they would be able to discuss issues with each other and with the policymakers and scientists in depth over a period of 90 minutes.

How did it go?

Strong and structured facilitation led to a rich discussion of the issues and initial feedback from participants and policymakers has been positive – and there were no major technical hitches. The Sounding Board platform also allowed us to integrate polling questions with discussion and record the discussion easily.

There’s been longstanding debate about whether online deliberation works, and how it can complement face-to-face dialogue. Sciencewise has dealt with these issues extensively, exploring them in the past in the report ‘In the Goldfish Bowl: Science and Technology Dialogues in the Digital World’.

There is an element of trust and mutuality in face-to-face interactions that is unlikely ever to be fully replicated online, but we should also recognise the barriers that many individuals face navigating the built environment, and the potential that digital platforms have for overcoming these. For my part, seeing geographically dispersed members of the public hold deliberative conversations across regional borders stood out, and is part of the unique offer that digital platforms bring to public engagement.

Was the deliberation effective?

During this trial, participants engaged with issues and with policymakers in much the same way as they might in a face-to-face workshop. In particular, I felt that some participants were more forthcoming about their views than they might otherwise have been in a face-to-face setting, which is encouraging, though more research would be needed to see whether my impression holds true.

One test of whether deliberation has worked effectively is to compare participants’ views before and after deliberation. We took care to poll participants first on their ‘top of the head’ responses, and then again post-deliberation. We found that there was indeed some substantial difference in their ‘informed’ responses – and made sure there was time at the reflections stage to explore how and why those views had changed.

Evaluation

The Sounding Board is being evaluated, as with all Sciencewise dialogues – and the evaluation phase of our pilot project is currently underway. As with any other workshop, we’re collecting feedback and responses from participants and we’re looking forward to honing and improving the tool.

Outputs and outcomes

As ever, at this stage – whilst it’s possible to measure the quality of the Sounding Board process and approach – it’s not yet possible to measure the entirety of the impact it has had on the policymaking process. Alongside the intrinsic value of the process for all those involved in challenging assumptions and perceptions, there were moments in the deliberation where policymakers acknowledged that points raised by the public would be taken away for further consideration – a key indicator of the value of the tool for more effective decision making. We’re looking forward to finalising the report and outputs so that the feedback from participants shapes government decision-making and thinking on the questions posed.

Piloting the way forward

This was a pilot proof of concept. If the evaluation concludes that this is a cost-effective way of feeding in high-quality information about public perspectives, then we will explore different ways in which the service could be provided in the future.

This blog was originally posted on the Sciencewise website. You can find it here.

2 Responses to “Digital Public Engagement – Lessons from the Sounding Board”

  1. October 30, 2015 at 11:30 am

    As organisations struggle to find solutions to complex problems, policy makers look for tools that can capture complexity and elicit insight(1). To what extent can sense-making tools like SenseMaker(2) inform adaptation policy?

    1. http://www.ecologyandsociety.org/vol20/iss1/art66/
    2. http://eu.sensemaker-suite.com/

  2. Reema Patel
    November 4, 2015 at 10:16 am

    Fascinating question. I think being able to marry qualitative input with quantitative data would be an incredibly useful method (if comprehensive enough) within a mixed methods research approach, and would be likely to complement deliberation incredibly well.
