First, ask yourself the following about every question you ask:
Are you asking questions capable of making change happen? After the survey is over, can you say to the bosses, “83% of our customer base agrees with answer A, which means we should change our policy on this issue.”
It feels like it’s cheap to add one more question, easy to make the question a bit banal, simple to cover one more issue. But, if the answers aren’t going to make a difference internally, what is the question for? —Seth Godin
In other words, if a question you ask doesn’t have the potential to lead you to change anything, leave it out!
Second, consider Seth’s sobering experience with “Any other comments?” style questions:
Here’s a simple test I do, something that has never once led to action: In the last question of a sloppy, census-style customer service survey, when they ask, “anything else?” I put my name and phone number and ask them to call me. They haven’t, never once, not in more than fifty brand experiences.
Gulp. Would your evaluation process fare any better? As Seth concludes:
If you’re not going to read the answers and take action, why are you asking?
I routinely get a ~70% response rate for evaluations of conferences I facilitate. Here are three reasons why this rate is so much higher than the typical 30–50% that other conference organizers report.
At the start of the first session at the event, I request that attendees fill out the online evaluations and explain why we want them to do so.
I promise attendees that all their feedback will be carefully read, that their evaluations are crucial for improving the conference the next time it is held, and that we will share all the (anonymized) evaluations with them. (I don’t share evaluations on individual sessions with attendees, but I forward all comments and ratings to the session presenters. I do share overall ratings and all general comments about the conference.)
When you explain to attendees why you are asking them to spend time providing evaluations, and they trust you to deliver what you’ve promised, they are much more open to providing feedback. And I suspect that when attendees know that other attendees will see their anonymized feedback, they may be more motivated to express their opinions.
I provide online surveys that are available at the start of the event, and that can be completed at any point. If the conference has a printed learning journal, I’ll include a printed version of the evaluation, so attendees can fill it out as an aide-mémoire during the event if they wish.
Post-conference, via email, I gently remind attendees who have not yet completed an evaluation. I include a due date (normally 10-14 days after the end of the event), and a few sentences reiterating the reasons why we’d appreciate their response. I send up to three of these reminders before the due date.
I don’t find any of this particularly onerous, and the result is a rich treasure trove of ideas and feedback from a majority of attendees that I can use to improve future conferences.
What’s your typical attendee response rate for conference evaluations? What do you do to encourage attendees to provide event feedback?