[WEBINAR] Q&A: How to Design Public Engagement to Find Common Ground
We had so many great questions from the webinar that it may take us a few days to answer them all. We will be adding to this blog as we answer more questions, so please feel free to check back to see more Q&A.
Does MetroQuest have case studies and examples of how these strategies have been implemented on commuter rail and railroad projects in the planning phase, or on projects under review/approval with the FRA? I see that FHWA and other agencies have utilized it, but I'm curious about usage during railroad planning.
MetroQuest has been used on multiple rail planning projects, from commuter rail to freight-focused studies. Whether it's alternative alignments, new investments, or policy projects, online engagement can be extremely helpful because the area of influence of these projects is often so vast that public meetings are unlikely to attract stakeholders from across the region. I would suggest you reach out to Derek Warburton at email@example.com or toll free at +1 855-215-0186 to discuss specific project types and approaches.
What information does the MetroQuest survey collect to assess the diversity of demographic participation?
MetroQuest always includes questions about demographics. These are placed at the end of what is typically a five-minute experience; we have found that most people are happy to provide this information after participating, once they are engaged in the subject matter and interested in the outcomes. It's up to the client to decide what information to collect to gain confidence in the diversity of the engagement. A best practice is to evaluate the demographics of the participation halfway through the engagement period to identify any gaps. There is then time to use targeted promotions and "go to them" engagement strategies to fill those gaps and meet the diversity goals of the project.
Where can we get more info on the MetroQuest tool itself, its applications, etc.?
Our apologies to those of you who wanted additional information about MetroQuest itself. Due to the educational nature of this particular webinar, we were not able to provide much detail on the software. The best way to learn more is to attend our next webinar, which is specifically about MetroQuest and how to optimize online engagement for the best results. You may also reach out to Derek Warburton at firstname.lastname@example.org or toll free at +1 855-215-0186 to learn more.
If you have people doing the survey at a public meeting on their phones, have you had any issues with overloading the internet bandwidth?
No, MetroQuest is a relatively light web-based application that loads quickly. At public meetings there are often iPad-style kiosk stations set up at the back or side of the room where people can participate when they wish, so only a few people participate at once. If the process involves people using their own phones at a meeting, some will connect over the mobile network and others over the local Wi-Fi. Depending on its capacity, the local Wi-Fi may get bogged down, but this is not a great concern since it is only likely to add a second or two of delay.
Would you describe the Westport survey as a statistically-valid survey of a certain group? Being an online survey, were participants able to forward the survey link to like-minded people?
Surveys that are open to the public are not like formal market research surveys, where a small, randomly selected group of people is polled about a topic. Generally, government agencies are keen to engage as many people as possible and are required to allow anyone to participate. For these reasons, MetroQuest is optimized to engage the largest number and diversity of participants possible.
That being said, it's also important for agencies to be confident that a small and vocal minority cannot skew the results. Yes, people are free to help spread the word no matter which side of the debate they are on, and generally we find that the results of broad engagement closely match the diversity of views in the community, and that the vocal minority remains a minority in the results.
In fact, the Westport project also included a statistically valid telephone survey, and its results were consistent with the MetroQuest results.
MetroQuest has safeguards that prevent abuses like ballot-stuffing (see the response to that specific question for details). Clients also receive data on participant demographics, so they can easily see whether any cohorts are over- or under-represented in the participation. Clients can view each cohort's results in isolation and can also calibrate the results to match the actual make-up of the region.
How do you recognize/reconcile “stuffing the ballot box” by special interest groups when using online polls? How do you protect against coming to conclusions that really don’t reflect the community’s opinions?
Planning processes can be controversial, and abuse is always a possibility. MetroQuest has tools and techniques to identify and help mitigate ballot-stuffing in order to protect the integrity of the results. Abuse is very rare but, if it happens, these tools, which combine IP tracking with several other data points on user activity, allow it to be easily recognized and nullified. They allow clients to be confident that someone isn't, for example, participating 500 times in an attempt to skew the results.
In addition to these measures, it's important to ensure that a balance of demographics is engaged in the process so that no one group dominates the results. MetroQuest allows clients to track participation by demographic category, and often, midway through an engagement process, efforts are made to fill any gaps using targeted promotional techniques.
Generally, with such large numbers of people being engaged online, special interest groups are not able to skew the results. The same is not true of public meetings, where a small and vocal minority can easily dominate the dialog. Special care must be taken to avoid this danger if public meetings are part of your outreach.
What tools did you use to push out the MetroQuest web site and let people know they could take the digital survey?
The survey was promoted in several ways: 1) the BartonPartners team created a series of 8 email blasts to a list of 4,000+ addresses to encourage participation in the digital survey, 2) the Town of Westport posted articles on its website, 3) three posters were displayed on signs at the train station platform, 4) visitors to the project website added 1,004 new email addresses, 5) those who took the survey shared it on Twitter, Facebook, email, and other blogs, and 6) we encouraged participation in the survey at many of the recorded monthly steering committee meetings.
How were you able to be inclusive of those who read different languages in the survey?
Although multilingual surveys are an available service through MetroQuest, the client did not request the survey in multiple languages.
How did you define the intent of your public engagement process and gain support from leadership to implement?
The purpose of the public engagement process was introduced in our proposal for professional services, discussed at multiple steering committee meetings, and presented at the Planning & Zoning Commission meeting.
How specifically did MetroQuest help on this project? Were there other innovative tactics?
MetroQuest assisted our team with 1) best practices for public engagement outreach to get people to take the survey, 2) recommendations for screens that would help accomplish our goals, 3) technical coordination of the survey content, and 4) assistance in the analysis of the survey results.
Westport: Frankly it looks like there’s way too much surface parking remaining to be a robust TOD plan. Did you cave in too much to NIMBY people? I did not understand how common ground was achieved.
Although it may not have been clear from the graphics, approximately one-third of the proposed 1,600 parking spaces are structured parking spaces. At present, none of the spaces are structured.
What methods did you use to ensure responses were representative of the community?
Upon learning during the first month that fewer youth had participated in the survey than expected, some of the committee members made an extra effort to hear from young people.
Can you talk more about the difference between the focus groups and the Steering Committee, and how both groups' input was used (decision making? recommendations? other?)? Also, how do people get to the survey on their cellphones? Is it a link, and if so, how do they get the link?
The eight focus groups were gathered based upon recommendations of the Steering Committee to represent property owners, near neighbors, etc. We created business cards with the project website and survey URLs and distributed them at the open house meeting. We shared the focus group, survey, and open house findings with the Steering Committee at various meetings.
Did the content you collected through the survey match up with the collection of data at the open house?
They were largely consistent; however, the questions asked at the open house were different from those in the survey. The open house forum offered more maps, street cross-sections, and different types of illustrations than the survey.
Sometimes a massive effort to increase transparency can lead to more objections or inflated expectations about the decisions to be made by stakeholders. Sometimes decisions are too complex to be made by large committees or public surveys. Do you wish you had limited the scope of any decisions in any way?
Yes, it would have been helpful to have a smaller steering committee to better address the more complex topics. We attempted to present less nuanced policy questions but were often drawn into more detailed decision-making than our meeting time permitted.
Can you give some examples of conflicting stakeholder desires and how these differences were resolved? Are the images in the community preference survey local examples?
Stakeholders debated the merits of 1) maximizing parking vs. allowing for mixed-use development, 2) structured vs. surface parking, 3) how best to mitigate cut-through traffic during periods of congestion on the adjacent interstate highway while still allowing local access to roads, and 4) permitting buildings tall enough to be economically viable.