It’s surprising how seldom we see a breakdown of the cost of public participation across multiple channels or tactics. These breakdowns give a unique window into community engagement that can be extremely useful to agencies trying to create engagement plans and realistic budgets to meet their outreach goals.
That’s why we were so surprised when the Metro Nashville Planning Commission included a powerful slide breaking down cost per participant in a recent webinar with over 600 people online. The agency employed a staggering eight different tools and tactics in its community engagement process for the APA award-winning NashvilleNext project. They then took the time to carefully break down the average cost to engage participants with each one.
Their community engagement activities included a range of face-to-face public sessions (events, meetings, focus groups, community conversations, lounges, and book-a-planner) and three of the most well-known online tools: Textizen, MindMixer, and MetroQuest.
Greg Claxton talking about cost per participant on NashvilleNext
Not surprisingly, the face-to-face activities were the costliest, ranging from $9 to $47 per participant, compared with $3 to $9 per participant for the online tools. While it would be good to have comparable data from other projects, there are some surprising findings here.
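The metric behind Nashville’s slide is simple: total channel cost divided by the number of participants that channel reached. A minimal sketch of that calculation is below — the channel names and dollar figures are hypothetical placeholders chosen to fall within the ranges quoted above, not NashvilleNext’s actual data.

```python
def cost_per_participant(total_cost, participants):
    """Average cost (in dollars) to engage one participant via a channel."""
    return total_cost / participants

# Hypothetical figures for illustration only -- not NashvilleNext's data.
channels = {
    "public meeting":   (4700, 100),   # (total cost $, participants)
    "focus group":      (1800, 40),
    "SMS poll":         (1500, 500),
    "online survey":    (2700, 900),
}

# Rank channels from cheapest to most expensive per participant.
for name, (cost, n) in sorted(channels.items(),
                              key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:15s} ${cost_per_participant(cost, n):6.2f} per participant")
```

The interesting part is less the division than the bookkeeping it forces: to fill in the table at all, an agency has to track both spend and turnout per channel, which is exactly the discipline most projects skip.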
The range of costs across the various face-to-face approaches is larger than I would have expected. I also would have expected higher costs for public meetings based on what I’ve seen in other projects. The relatively low costs are likely a testament to Nashville’s excellent promotional efforts to fill the meetings. The meeting shown above was an outlier; its lower cost was due to a terrific turnout on a hot topic.
While it’s gratifying that MetroQuest came out looking great, it’s important to point out, as Greg Claxton of the Metro Nashville Planning Commission does in the video, that each tool or approach can attract a different audience. In our experience, the most cost-effective strategy can be to use some of the face-to-face tactics to target demographic groups that are missing online, thus filling in the gaps.
For the online tools, the results met my expectations, but I live in the online engagement world every day. SMS-based tools like Textizen can be particularly effective with demographics that love to text. While the audience size might be more limited, texting can certainly be an important component of a project, especially one seeking to hear from youth. Generally, tools that require participant registration, like MindMixer (now called My Sidewalk), attract less participation because of people’s resistance to signing up. Even given those factors, the cost per participant was still relatively low for all of the online tools.
I also caught up with Greg Claxton at the APA National Conference. You’ll see several clips of our interview in the short highlight compilation video below.
Highlights from 9 Mini Interviews on the Floor at APA 2016
P.S. Here are some thoughts on why these sorts of cost breakdowns are so rare. Firstly, to compare strategies fairly, agencies need to use a wide variety of approaches on the same project. With shrinking budgets, that’s hard to pull off. Secondly, someone has to do the work to cost out and compare the results of each activity. Thirdly, and perhaps most importantly, that analysis has to be made public in order for you to learn about it. These projects are team efforts with multiple people, firms, and departments attached to each activity, and with politicians and the public scrutinizing the effort. Given that mix of actors, there are almost always a few who would rather not share the cost-effectiveness details. Hopefully this sort of transparency will become increasingly common.
Hi Dave, I wonder …. cost effectiveness is a bit tricky. If the effect is to inform the public, then the contact cost itself may index a presumed effect. What if we want to draw citizens into the development of plans …. I mean specifically plans that they and their neighbors and friends would have some stake in the outcomes of. The output would then be evidence of some specific form of “engagement” beyond being engaged in a public information campaign. I am sure that you have thought long and hard about this matter …. how can we be sure that we are being effective when we invest in efforts to engage citizens? Isn’t a good deal of the magic in the process that is used in each generic form of engagement? Is the cost per event worth the effort? I hope that you can elaborate, because I do feel that you are on an important theme.
Hi Tom, thanks for your comment. I’m with you on this. In this short post I didn’t elaborate on the quality differences between these different methods. I think that would be a valuable addition, but it may be better suited to a longer-form document. Stay tuned! In the meantime, our guidebook elaborates on various ways to engage citizens in the development of plans online. In a post called Public Engagement 3.0, I elaborate on a way to combine online and face-to-face engagement efficiently, drawing on the best aspects of both. I hope these resources are useful. Thanks for the encouragement to add more on this topic. Will do.
Great post Dave! We should talk about working together 🙂
Alex. He’s a great guy! Would love to hear about a potential partnership!