In the past few months, the public engagement community has ramped up dialogue about how to measure success. It’s a tricky question, since metrics as clear-cut as miles per gallon are hard to find in the social sciences. A great place to begin in any such exercise is to clearly define what success means, so we decided to do some research.
Once a researcher, always a researcher I suppose. Our software is the result of a university research project, and to this day we still use research to guide us. To address this question we spoke with planners and public involvement personnel at 150 government agencies throughout the US and Canada. Here are the top 4 answers to the question: “What does better public engagement mean to your agency?” Two of the answers related to quantity and two related to the quality of public engagement.
1. Quantity: Engaging more people
A common source of stress for agency staff is getting up in front of council to report on the results of their public outreach process and being asked, “How many people did you hear from?”
For public agencies with political leaders, the total number of citizens engaged is top of mind. Let’s face it – it’s difficult to put much faith in what 50 people said about a city-wide plan. Online community engagement platforms are helping to reach more people than ever before. Using our own projects as examples, the typical project that uses MetroQuest engages several thousand participants – many engage over 10,000.
2. Quantity: Engaging a broader demographic
The desire to engage people of all ages, races, national origins or incomes is a top priority with government agencies. When we look at the narrow demographic of people who frequent public meetings, it’s easy to understand why. There are several powerful strategies to increase the breadth of community engagement. We are currently working on a guidebook on this topic so stay tuned for its release in early 2016.
3. Quality: Collecting informed input
Quantity without quality is meaningless when it comes to public engagement. Planning agencies are dealing with complex topics. Yet, many of the latest public engagement strategies use overly simplistic activities to gather input. Multiple-choice survey questions without the proper context won’t lead to informed input from participants.
The agencies we talked to were looking for new ways to build education into their public engagement activities to ensure that the input being gathered is as informed as possible. At the same time, they wanted the experience to be fast and fun to avoid scaring people off, which makes it a challenging balance.
For example, when polling the public about planning alternatives people should have a variety of ways to evaluate the options. They should be able to see what they look like and how they perform on various metrics before they weigh in with their opinions. Using tools like visual preference surveys and scenario comparisons can be an effective way of educating people on alternatives while keeping the experience fun and easy to follow.
The strength of traditional face-to-face public meetings is that people are there long enough to learn about the issues in depth. That allows these participants to provide informed input. With more engagement moving online, it’s important to build educational elements into the process. There are a variety of specialized screens in MetroQuest that combine education with engagement, and other online tools dealing with participatory budgeting provide good examples as well.
4. Quality: Collecting actionable input
Agencies are calling for public input that can be more easily used to support decisions. Typically, the “product” of public engagement activities that influences decision making is not the raw input. It is more likely the summary graphs and conclusions drawn from trends and patterns observed in the analysis of the results. The nature of this end “product” is often not considered as carefully as it should be in the design of public engagement activities.
Public agencies seeking to be viewed as open to new ideas are tempted to ask open-ended questions. The problem is – how do you use hundreds or thousands of open-ended answers? It’s difficult to summarize free-text responses in reports and reach strong, defensible conclusions.
For this reason, engagement activities with quantifiable results are favored over open-ended input options. Better methods include asking people to rank or rate options and priorities. Quantitative inputs from exercises like allocating budgets to different categories lead to actionable data. These types of inputs are easier to compare across participants. When combined with demographic information, it’s easy for project staff to report on the trends and opinions of the community.
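To illustrate why quantifiable inputs are easier to report on, here is a minimal sketch of how rating-style responses might be aggregated by demographic group. The data, field names, and function are hypothetical – MetroQuest’s actual data model is not described in this post – but the pattern shows how structured inputs roll up directly into the kind of summary graphs agencies need.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: each participant rates planning
# priorities on a 1-5 scale and reports an age group.
responses = [
    {"age_group": "18-34", "ratings": {"transit": 5, "parks": 3, "housing": 4}},
    {"age_group": "18-34", "ratings": {"transit": 4, "parks": 4, "housing": 5}},
    {"age_group": "55+",   "ratings": {"transit": 2, "parks": 5, "housing": 3}},
    {"age_group": "55+",   "ratings": {"transit": 3, "parks": 4, "housing": 2}},
]

def average_ratings_by_group(responses):
    """Average each priority's rating within each demographic group."""
    buckets = defaultdict(lambda: defaultdict(list))
    for r in responses:
        for priority, score in r["ratings"].items():
            buckets[r["age_group"]][priority].append(score)
    return {
        group: {p: round(mean(scores), 2) for p, scores in priorities.items()}
        for group, priorities in buckets.items()
    }

summary = average_ratings_by_group(responses)
```

Free-text answers offer no comparable aggregation path, which is exactly the reporting problem described above.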
In a nutshell
We have summarized these four findings into one sentence:
“The best community engagement collects informed and actionable input from a large and diverse group of participants to inform decision making.”
You’ll notice that an additional criterion was added as a result of speaking with experts from the International Association for Public Participation and the National Coalition of Dialogue and Deliberation. It adds that the best public participation has a measurable and demonstrable impact on decision making. Together these criteria provide a framework for designing effective public participation processes. They also form a basis for developing metrics to measure success.
What did we miss?
During our research we heard many goals for improving public engagement. While we’ve listed the four most frequently heard goals, we’d love to hear others from you. Have we missed a goal that’s top of mind for you?
David, this is a strong statement for community engagement on the “supply side” of decision making, but doesn’t address community engagement in the task of design of alternative options for action nor does the definition identify explicit criteria for continually improving the engagement process. I would be interested in your thoughts in these directions, too. Cheers, Tom
Thanks for your follow-up questions, Tom. I think the criteria listed apply to all phases of community engagement, even the early stages that sometimes involve the design of alternatives. The design tasks are more challenging to do effectively with online tools, since those tasks are typically more time-consuming and benefit from collaboration with peers and guidance from the project team. You might enjoy an earlier post where I lay out a strategy for dovetailing high tech with high touch at different stages of the process: https://metroquest.com/public-engagement-3-0/.
I think the framework points to criteria that could be used to monitor and evaluate the success of engagement processes objectively. I think the practice of monitoring and evaluation given a stable set of criteria would go a long way towards encouraging continuous improvement.