A novel way to assess consensus

[Image: a blurred photograph of an audience overlaid with an illustration of people humming at different volumes]

Chapter 44 of my book The Power of Participation explains how facilitators use participatory voting to provide public information about viewpoints in the room, paving the way for further discussion. In particular, we often use participatory voting to assess consensus.

It’s often unclear whether a group has formed a consensus around a specific viewpoint or proposed action. Participatory voting techniques can quickly show whether a group has reached consensus, is close to it, or wants to continue the discussion.

Methods to assess consensus

For small groups, Roman voting (The Power of Participation, Chapter 46) provides a simple and effective method of assessing agreement.

However, Roman voting isn’t great for large groups, because participants can’t easily see how others have voted. Card voting (ibid, Chapter 47) works quite well for large groups, but it requires:

  • procurement and distribution of card sets beforehand; and
  • training participants on how to use the cards.

A novel way to assess consensus with large groups

I recently came across a novel (to me) way to explore large group consensus. This simple technique requires no training or extra resources. In addition, it’s a fine example of semi-anonymous voting: group voting where it’s difficult to determine how individuals vote without observing them during the process. [Dot voting (ibid, Chapter 49) is another semi-anonymous voting method.]

Want to know how it works?

By using humming!

Humming: a tool to assess consensus

Watch this 85-second video to see how it works.

I learned of this technique from the New York Times:

“The Internet Engineering Task Force eschews voting, and it often measures consensus by asking opposing factions of engineers to hum during meetings. The hums are then assessed by volume and ferocity. Vigorous humming, even from only a few people, could indicate strong disagreement, a sign that consensus has not yet been reached.”
‘Master,’ ‘Slave’ and the Fight Over Offensive Terms in Computing, Kate Conger, NY Times, April 13, 2021

The Internet Engineering Task Force (IETF) is “the premier Internet standards body, developing open standards through open processes.” We all owe the IETF a debt, as they are largely responsible for creating the technical guts of the internet.

On Consensus and Humming in the IETF

The IETF builds consensus around key internet protocols, procedures, programs, and concepts and eventually publishes them as RFCs (Request for Comments). And, yes, there is a wonderful RFC 7282 On Consensus and Humming in the IETF (P. Resnick, 2014).

Software engineer Ana Ulin explains further:

‘In IETF discussions, humming is used as a way to “get a sense of the room”. Meeting attendees are asked to “hum” to indicate if they are for or against a proposal, so that the person chairing the meeting can determine if the group is close to a consensus. The goal of the group is to reach a “rough consensus”, defined as a consensus that is achieved “when all issues are addressed, but not necessarily accommodated”.’
Ana Ulin, Rough Consensus And Group Decision Making In The IETF

When assessing rough consensus, humming allows a group to experience not only the level of agreement or disagreement but also the extent of strong opinions (like Roman voting’s up or down thumbs). And if the session leader decides to hold further discussion, the group will have an idea of who holds such opinions.
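To make this concrete, here’s a toy sketch (my own illustration, not an actual IETF procedure) of how a chair might read two rounds of humming. It treats each participant’s hum as a loudness value and applies the heuristic from the quote above: vigorous humming, even from only a few people, signals that consensus has not been reached.

```python
# Toy model of humming-based consensus assessment (illustrative only).
# Each hum is a loudness from 0 (silent) to 10 (vigorous). Vigorous
# opposition -- even from a few people -- blocks rough consensus;
# otherwise we compare overall volume for and against.

def assess_hums(hums_for, hums_against, vigorous=7):
    """Return a rough reading of the room from two rounds of humming."""
    if any(h >= vigorous for h in hums_against):
        return "strong disagreement: keep discussing"
    if sum(hums_for) > 2 * sum(hums_against):
        return "rough consensus in favor"
    return "no clear consensus yet"

print(assess_hums([5, 6, 4, 5], [1, 2]))  # mostly quiet dissent
print(assess_hums([5, 6, 4], [9]))        # one vigorous objection
```

The threshold and the 2:1 volume ratio are arbitrary choices for the sketch; in a real room the chair makes this judgment by ear, which is exactly the point of the technique.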

Voice voting

This technique to assess consensus reminds me of voice voting at New England Town Meetings, which have been annual events in my home state, Vermont, since 1762. But while it’s possible to hear loud “Aye” or “Nay” votes, humming makes strong opinions easier to detect.

Conclusion

RFC 7282 starts with an aphorism expressed by Dave Clark in 1992 on how the IETF makes decisions:

“We reject: kings, presidents and voting.”

“We believe in: rough consensus and running code.”

Replace “running code” with your organization’s mission, and you may just have the core of an appealing approach to decision-making in your professional environment.

And let us know in the comments below if you try using humming as a tool to assess consensus!

Video “Please Hum Now: Decision Making at the IETF” courtesy of Niels ten Oever
Image background by Robert Scoble

Participatory voting at events: Part 2—Low-tech versus high-tech voting

[Image: low-tech and high-tech voting—a photograph of low-tech RSQP voting using sticky notes on wall-mounted flipchart paper]

In Part 1 of this series, I defined participatory voting and we explored the different ways to use it to obtain public information about viewpoints and participants in the room, paving the way for further useful discussion. Now let’s explore low-tech and high-tech voting solutions.

High-tech voting

There is no shortage of high-tech systems that can poll an audience. Commonly known as Audience Response Systems (ARSs), Student Response Systems (SRSs), or “clickers,” these systems combine an audience voting method—a custom handheld device, personal cell phone/smartphone, personal computer, etc.—with a matched receiver and software that processes and displays responses.

Here are three reasons why high-tech ARSs may not be the best choice for participatory voting:

  • ARSs necessitate expense and/or time to set up for a group. No-tech and low-tech approaches are low or no cost and require little or no preparation.
  • Most ARS votes are anonymous; no one knows who has voted for what. When you are using voting to acquire information about participant preferences and opinions, as opposed to deciding between conflicting alternatives, anonymous voting is rarely necessary. (An exception is if people are being asked potentially embarrassing questions.) When a group of people can see who is voting for what (and, with some techniques, even the extent of individual agreement/disagreement), it’s easy to go deeper into an issue via discussion or debate.
  • Participatory voting techniques involve more movement than pushing a button on an ARS device. This is important, because physical movement improves learning. Some techniques include participant interaction, which also improves learning.

While there are times when high-tech voting is the right choice, I prefer no-tech and low-tech techniques for participatory voting whenever possible. No-tech techniques require only the attendees themselves. Low-tech approaches use readily available and inexpensive materials such as paper and pens.

No-tech and low-tech

Wondering what no-tech and low-tech voting techniques can be used for participatory voting? Here’s a list, taken from a glossary of participation techniques covered in detail in my book The Power of Participation: Creating Conferences That Deliver Learning, Connection, Engagement, and Action.

Body/Continuum Voting: See Human Spectrograms.

Card Voting: Provides each participant with an identical set of colored cards that can be used in flexible ways: typically for voting on multiple-choice questions, consensus voting, and guiding discussion.

Dot Voting: A technique for public semi-anonymous voting where participants are given identical sets of one or more colored paper dots which they stick onto paper voting sheets to indicate preferences.

Hand/Stand Voting: In hand voting, participants raise their hands to indicate their answer to a question with two or more possible answers. Stand voting replaces hand raising with standing.

Human Graphs: See Human Spectrograms.

Human Spectrograms: Also known as body voting, continuum voting, and human graphs. A form of public voting that has participants move in the room to a place that represents their answer to a question. Human spectrograms can be categorized as one-dimensional, two-dimensional, or state-change.

Idea swap: A technique for anonymous sharing of participants’ ideas.

One-dimensional Human Spectrograms: Human Spectrograms where participants position themselves along a line in a room to portray their level of agreement/disagreement with a statement or a numeric response (e.g. the number of years they’ve been in their current profession.)

Plus/Delta: A review tool that enables participants to quickly identify what went well at a session or event and what could be improved.

Post It!: A simple technique that employs participant-written sticky notes to uncover topics and issues that a group wants to discuss.

Roman Voting: Roman Voting is a public voting technique for gauging the strength of consensus.

State-change Human Spectrograms: Human Spectrograms where participants move en masse from one point to another to display a change of some quantity (e.g. opinion, geographical location, etc.) over time.

Table Voting: A technique used for polling attendees on their choice from pre-determined answers to a multiple-choice question, and/or for dividing participants into preference groups for further discussions or activities.

Thirty-Five: A technique for anonymously evaluating participant ideas.

Two-dimensional Human Spectrograms: Human Spectrograms where participants position themselves in a two-dimensional room space to display relative two-dimensional information (e.g. where they live with reference to a projected map.)

This ends my exploration of low-tech and high-tech voting solutions. So what are public, semi-anonymous, and anonymous voting? We’ll explain these different voting types, and explore when each should be used, in the third part of this series.