
Ignore best practices - how to craft feedback questionnaires that work

One of the most popular forms of citizen participation is asking for feedback and ideas. The administration wants to know what people think and, in the best case, incorporate that feedback into the project at hand.

There are different ways to gather feedback from citizens - be it town hall meetings, workshops, or the possibility to comment on plans. The most widely used format, however, is the questionnaire. Whenever a city or an organization wants to know what people think or want, they put out a questionnaire. This makes sense: the format can be disseminated widely, and the results can be compared relatively easily.

When feedback questionnaires are designed, the questions are usually pretty straightforward. You want to know what a person thinks about something? You ask: what do you think about that? You want to know what a person wants? You ask: what do you want?

Sounds logical. But the reality is, in many cases it doesn’t make sense.

How to not ask for ideas

Let’s take one of our latest projects. A city we are working with has a problem. They are spending a lot of money on culture - to be more concrete, they sustain theaters, galleries, concert venues, and so forth. But they observed that the audience is growing older and that some offerings didn’t draw the attention they were hoping for. They felt it was time to readjust the cultural offerings. But how? This is where participation came into play - a natural fit, since in this case the citizen is also the customer. The city wanted to know how the current cultural offerings are perceived, and they wanted suggestions on what to change.

The natural way was to design a questionnaire. In the beginning the questionnaire looked a little bit like this:

  • Question 1 - How do you perceive the culture in our city?

  • Question 2 - Which ones of these offerings do you know? [Offerings to choose from]

  • Question 3 - How can we enhance the city's cultural offering?

We had around 15 very straightforward questions. It was “best practice” - exactly how any other city would run an online questionnaire.

Then in one meeting, one person dared to ask a question: “Guys, do you know that famous quote attributed to Henry Ford - ‘If I had asked people what they wanted, they would have said faster horses’? So what do we really expect if we ask questions like that?”

This sparked a very productive conversation. We certainly wanted to work with the citizens’ feedback. But the way the questions were asked, they were bound to generate generic answers that would probably miss the aspects that really needed to change.

The reason is clear. No one is as deep in the cultural offerings as the public servants who work with them every day. And while outside perspectives are important, the concrete suggestions a regular citizen can give are limited - they are not experts on the topic. Their feedback on ideas can certainly be very valuable thanks to their unique perspective. But to demand that they come up with ideas the experts never thought of is pretty optimistic. The chances are therefore very slim that among these suggestions there are some that are both realistic and not already under discussion within the administration. We needed questions that would provide feedback we could use more effectively.

A tech-approach for a questionnaire on culture

So we turned to the people who are very good at asking feedback questions - our tech team. It is their job to constantly enhance our citizen participation platform. And alongside our own product vision, the feedback of users and customers is the central guide for future development. That is why our tech team conducts as many user interviews as possible.

However, they never ask what features a customer wants. Instead, they try to understand users’ behavior and key needs, and develop features accordingly.

We wanted to use that experience, so we invited our tech team to develop the questionnaire together with the city and our engagement experts. The result looked very different from the first draft. The city used questions like:

  • What was the best cultural experience you had in the last year (in the city or somewhere else) and why?

  • What do you expect when you attend a cultural event?

  • When you’re at home - what do you enjoy most reading, watching or listening to?

The questions weren’t as straightforward anymore. Instead, they aimed at getting to know the citizens: what they like and what they don’t, how they behave, and what is important to them.

Any expert in participation might object that this is not really a new concept, since design thinking methods are increasingly used in offline participation. This is true. In offline participation (given the respective budget), this is done here and there. But when it comes to online questionnaires, the approach is definitely underused.

The questionnaire was not the only format we implemented. In order to allow more bottom-up participation and to detect blind spots, the city also opened an idea box on their Civocracy project page.

Would the new approach be successful?

As the launch day approached, we were getting a little nervous. Could this kind of questionnaire really work? What if people perceived it as intrusive and no one actually filled it out? The city was putting out press releases and had even started a social media advertising campaign, but we still didn’t know whether that would be enough. It was frightening and exciting at the same time.

On the first day we found ourselves constantly refreshing the page in order to see if people would actually participate.

It turned out that our worries were unjustified. We got almost 8,000 answers to our questions. The completion rate was lower than usual, meaning more people did not answer every question. But with these types of questions, that was to be expected: this was not a quick and easy survey but an in-depth one, which is always demanding and naturally leads to more people dropping out before the end.

On the other hand, the answers we received were of much higher quality than usual. Admittedly, the analysis in this approach is a lot more difficult. But isn’t that precisely because you really incorporate people’s feedback instead of just asking for the sake of having asked? For us it is clear: we would choose this route again any day.

When we held our meeting at the end of the project, we came up with one central lesson that we believe holds true for many engagement projects out there:

Don’t trust best practices

If we had done what everyone does, we would have gotten the same limited results that everyone gets, with the same limited impact. We’re glad we didn’t follow the usual path.

We know, however, that engagement projects, especially in the public sector, are desperately looking for best practices. And we understand this mindset: it stems from a need for security. If everything goes sour, you can still point to the fact that you did exactly what others had done, so the reason for failure can’t be you.

However, this approach is the ultimate blocker for innovation. And in times of enormous challenges paired with low levels of engagement, we desperately need innovation to move forward. We therefore advise taking risks and approaching engagement differently. Don’t always look at what others are doing and how; their “best practices” are mostly not that good anyway. Instead, think about what you want to achieve, what you could do differently, and where to find inspiration outside the engagement sector.

It is part of innovation that you will fail. If you take a risk, you could fail completely. But we believe that only this innovative mindset will ultimately lead to engagement projects that really unleash the power a community can have. We hope for more of these inspiring stories in the future.

If you want to think together, write us at

