How to get great attendee evaluation response rates

I routinely get ~70% response rates for evaluations of conferences I facilitate. Here are four reasons why this rate is so much higher than the typical 30-50% response rates that other conference organizers report.

  1. At the start of the first session at the event, I request that attendees fill out the online evaluations and explain why we want them to do so.
  2. I promise attendees that all their feedback will be carefully read.
     - I tell attendees that their evaluations are crucial for improving the conference the next time it is held.
     - I tell attendees that we will share all the (anonymized) evaluations with them. (I don’t share evaluations of individual sessions with attendees, but I forward all comments and ratings to the session presenters. I do share overall ratings and all general comments about the conference.)

    When you explain to attendees why you are asking them to spend time providing evaluations, and they trust you to deliver what you’ve promised, they are much more open to providing feedback. And I suspect that when attendees know that other attendees will see their anonymized feedback, they may be more motivated to express their opinions.

  3. I provide online surveys that are available at the start of the event, and that can be completed at any point. If the conference has a printed learning journal, I’ll include a printed version of the evaluation, so attendees can fill it out as an aide-mémoire during the event if they wish.
  4. Post-conference, via email, I gently remind attendees who have not yet completed an evaluation. I include a due date (normally 10-14 days after the end of the event), and a few sentences reiterating the reasons why we’d appreciate their response. I send up to three of these reminders before the due date.

I don’t find doing any of this particularly onerous, and the result is a rich treasure-trove of ideas and feedback from a majority of attendees that I can use to improve future conferences.

What’s your typical attendee response rate for conference evaluations? What do you do to encourage attendees to provide event feedback?

Photo attribution: Flickr user herzogbr

2 thoughts on “How to get great attendee evaluation response rates”

  1. I also get around 70% completion rates on my evaluations, and I pretty much apply the same techniques that you do. Two things I don’t do: share the online form at the start of the event, and provide a printed copy for people to fill out onsite as an aide-mémoire. I usually share the evaluation one or two days after the conference ends, along with all the presentation PDFs, and then send one or two reminders in the following ten days. I do, however, explain onsite, several times (at the start of the event, at the end, and sometimes in the middle, depending on the length of the event), why we want participants to fill out the forms, and my reasons are quite similar to your points.

     I don’t do printed forms onsite because that is what we used to do in the past, and my fear is that people will just hand them in to us and thus not do the online survey. Then you have a logistics problem: some responses are on paper and others are digital, which means you need to “upload” the physical ones online, which takes time. Also, if there are any last-minute changes to the program, or questions arise during the event that you want to ask about in the survey, you cannot accommodate them with a printed form. For example, if during a panel or group discussion a topic comes up that is of great interest to participants and should be explored further, you may want to add a question about it to the survey. That also rules out sharing the online survey at the start of the conference, actually.

    1. All good points, Thomas. (I should clarify that printed evaluations like the one in the image for this post are not recommended!) At the peer conferences I run, I supply two online evaluations: one for the fixed aspects of the conference, available at the start of the event, and one for the peer sessions generated by the participants, made available as soon as the topics have been determined.
