While musing about Facebook’s recent changes to “prioritize posts that spark conversations and meaningful interactions between people” over content from media and brands, Jeff Jarvis coins a new definition of journalism:
“…convening communities into civil, informed, and productive conversation, reducing polarization and building trust through helping citizens find common ground in facts and understanding.”
—Jeff Jarvis, Facebook’s changes
That sounds a lot like the mission of the participant-driven and participation-rich events I’ve been championing for so long. Journalism can’t provide the connective power of face-to-face meetings. But its potential for helping individuals and communities build trust and find common ground is worthy and welcome.
One of the easiest, yet often neglected, ways for meeting professionals to improve their craft is to obtain (and act on!) client feedback after designing, producing, or facilitating an event. I like to schedule a thirty-minute call at a mutually convenient time one or two weeks after the event, giving the client time to decompress and process attendee evaluations.
During a recent call, a client shared their conference evaluation summaries that rated individual sessions and the overall conference experience.
This particular annual conference uses a peer conference format every few years. The client finds the Conferences That Work design introduces attendees to a wider set of peer resources and conversations at the event. This year, The Solution Room was a highly rated session for building connections and getting useful, confidential peer consulting on individual challenges.
As the client and I talked, we realized that the evaluations had missed an important component. We were trying to decide how frequently the organization should alternate a peer conference format with more traditional approaches. However, we had no attendee feedback on how participants viewed the effectiveness of the annual event for:
making useful new connections;
getting current professional wants and needs met; and
receiving relevant just-in-time learning.
Adding ratings of these KPIs to conference evaluations provides useful information about how well each event performs in these areas. Over time, conveners will see if/how peer conference formats improve these metrics. I also suggested that we include Net Promoter Scores in future evaluations.
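For readers unfamiliar with Net Promoter Scores: NPS is derived from a single 0–10 "how likely are you to recommend this event?" question, with promoters scoring 9–10 and detractors 0–6. Here's a minimal sketch of the standard calculation (the example ratings are invented for illustration, not from this client's evaluations):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'likelihood to recommend' ratings.

    Promoters score 9-10, detractors 0-6, passives 7-8.
    NPS = (% promoters) - (% detractors), a value from -100 to +100.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Ten hypothetical responses: six promoters, two passives, two detractors
print(net_promoter_score([10, 9, 9, 10, 9, 9, 8, 7, 5, 3]))  # → 40
```

Tracking this single number across events makes year-over-year comparisons between peer conference formats and traditional approaches straightforward.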
The client quickly decided to include these ratings in future conference evaluations. As a result, our retrospective call improved how participants evaluate the organization's events, and will provide data for more informed decisions about future conference designs.
Do your evaluations allow attendees to rate the connection and just-in-time learning effectiveness of your meeting? Do they rate how well your meeting met current professional wants and needs? If not, consider adding these kinds of questions to all your evaluations. Over time you’ll obtain data on the meeting designs and formats that serve your participants best.