As recently reported by MeetingsNet, Expo! Expo! “will offer all exhibitors access to Zenus AI’s facial-analysis technology after a limited rollout at the 2023 [show]”. However, MeetingsNet also includes the following statement:
“Nicole Bowman, vice president of marketing and communications for IAEE, says that because the technology ‘gives anonymized sentiment about areas [of the show floor], we would not need to, nor did we in 2023, notify attendees’ that it was being used.” [emphasis added]
—Rob Carey, MeetingsNet, Expanded Use of Facial Analysis Coming to Events-Industry Show
So IAEE’s statement directly contradicts Zenus’s own recommendation for clients, which includes notifying attendees about the technology through “signage on-site”, “marketing communications”, and inclusion in “their online terms and conditions”.
The reality of attendee awareness
When meeting attendees learn they are being monitored for age, gender, and emotions, reactions are often negative. For example, in response to Greg Kamprath’s post You Shouldn’t Use Facial Analysis At Your Event, attendees expressed discomfort after discovering that cameras were observing them “pretty much everywhere” at PCMA’s 2023 Convening Leaders event.
As discussed in these posts, the design of Zenus’s technology does not allow individual attendees to opt out. If IAEE followed Zenus’s recommendation to notify attendees that facial analysis would operate “across the entire show floor” and attendees then requested not to be surveilled, IAEE and Zenus would be unable to comply with these requests without shutting down the entire system.
An added complication is that the 2024 IAEE Expo! Expo! will be held in California, where the California Consumer Privacy Act of 2018 (CCPA) grants consumers certain rights over the personal information that businesses collect about them. The CCPA defines both “personal information” and “biometric information” broadly.
Whether anonymized facial-analysis data falls under either definition is a grey area that the technology exploits, and one that remains in legal limbo: neither definition clarifies whether consumers have the right to opt out.
Still, at the very least, attendees should have the right to request exclusion from facial analysis surveillance. IAEE’s decision not to inform attendees, despite Zenus’s recommendation, suggests an intent to sidestep these legal and ethical issues.
Conclusions
At a minimum, IAEE should follow Zenus’s recommendation that it inform Expo! Expo! attendees that facial analysis technology will be operating across the show floor for exhibitors’ benefit.
Only then might we see how attendees truly feel about such surveillance.
What are your thoughts on using facial analysis to gauge “attendee sentiment”? Do you believe attendees have a legal or ethical right to be informed? Should they have the right to opt out?
There’s no shortage of confident futurists. Unfortunately, based on results, we’re not particularly good at predicting the future.
Why is Predicting the Future Important?
Predicting the future has become increasingly vital as the pace of change in human societies accelerates. Without accurate forecasts, the negative consequences of unforeseen challenges can escalate.
Some global trends offer optimism: Incomes are rising (despite persistent disparities), poverty is steadily declining, people are living longer, wars are fewer and shorter, and the gender gap in education and income is narrowing.
However, many global trends are deeply concerning: Climate disasters, aging populations, increasing pandemic outbreaks, and ongoing geopolitical instability paired with more lethal technologies are just a few of the alarming issues.
Four Reasons We’re Bad at Predicting the Future
1. Human Nature Is Resistant to Change
Despite our desire to believe we are rational beings making decisions based on facts and science, it turns out that people are primarily driven by emotions, not reason.
At first sight, history seems to provide evidence that people can change. For instance, in the United States, a restrictive form of democracy was adopted in 1787, slavery was abolished in 1865, and most women gained the right to vote in 1920. However, these societal changes remain controversial even today.
Pundits often focus on cultural changes, overestimating how people’s fundamental psychological and emotional responses to experiences and issues evolve over time.
2. Technology Changes Everything—Faster and Faster
Alan Kay once said, “Technology is anything that was invented after you were born.”
“We are enveloped by rapidly changing technology and we fixate on what is new. What was new quickly becomes taken for granted and largely invisible. As David Weinberger remarks: ‘Technology sinks below our consciousness like the eye blinks our brain filters out.’
Although technology in the form of human tools has existed for over three million years and we’ve had books for over half a millennium, the first history of technology wasn’t written until 1954.”
The invisibility of most technology and the escalating pace of innovations make it very difficult to predict how they will impact our lives.
Seth Godin illustrates this as follows:
“In a bad 1950s science fiction movie, you might see flying jetpacks, invisibility cloaks and ray guns.
What we got instead is a device that fits in our pocket. It allows us to connect to more than a billion people. It knows where we are and where we’re going. It has all of our contacts, the sum total of all published knowledge, an artificially intelligent computer that can understand and speak in our language, one of the best cameras ever developed, a video camera with editor, a universal translator and a system that can measure our heart rate. We can look up real time pricing and inventory data, listen to trained actors read us audiobooks and identify any song, any plant or any bird. We can see the reviews from our community of nearby restaurants or even the reputation of a doctor or lawyer. It can track the location of our loved ones and call us a chauffeured vehicle at the touch of a button.
And of course, we use it to have arguments. And to watch very short stupid videos.”
True, some people, like Isaac Asimov, have accurately predicted specific futures, but we tend to remember their successes and overlook the many incorrect predictions.
3. We Struggle to Predict the Speed of Change
Even when people successfully predict future developments, they rarely get the timing right. It is far easier to foresee what will happen than when it will happen.
Here are three examples:
Solar energy
In 1978, I started a solar manufacturing company, convinced that solar energy would become an important energy source. For five years we thrived, building solar hot water heating systems and selling and installing them around New England. When Ronald Reagan became president, he abolished the existing solar tax credits, and the solar industry disappeared for twenty years.
We were right, but it took over forty years. Who knew it would take so long? No one!
Fusion power
Research into generating power from nuclear fusion reactions began in the 1940s. The goal of creating a sustained, net-power-producing fusion reactor has been pursued ever since, yet fusion continues to remain “10, 20, or 30 years away”, depending on who you ask.
Meeting process
I have been designing and facilitating participant-driven and participation-rich conferences for over thirty years, and evangelizing the advantages of this approach since 2009. While the meeting industry is slowly realizing the importance of facilitating connection at events, traditional conferences still dominate. Whether my approaches will ever become mainstream remains uncertain.
There are many other examples
Predicting when the COVID pandemic would be over, the length and severity of wars, and the speed of acceptance of gay marriage in the United States all come immediately to mind. You can doubtless think of more.
4. Technology Changes Our Lives in Unpredictable Ways
Futurists’ jobs are made even harder by what Kevin Kelly points out in his thought-provoking book What Technology Wants: the technology we create changes humans in ways we can’t imagine.
In the early 1990s, I was one of the first users of the commercial internet. I was an IT consultant at the time and my enthusiastic conviction that the internet would change everything fell on deaf ears. Several CEOs told me later they wished they’d listened to me. Seth Godin supplies examples above of how the internet has indeed changed everything in unexpected ways.
How many of the social impacts of cars were predicted when they began to be mass-produced a century ago?
Or the impacts of developments in religion, law, political systems, medicine, and education on our lives?
Can we predict the future?
In my opinion, anyone who confidently predicts the future is guilty of hubris. Unfortunately, that won’t stop people from trying.
Technology is one of the areas where culturally embedded designs impact society. My 2013 Meeting Professional column described how existing meeting technology becomes invisible, which induces stakeholders to ignore its stultifying influence on meeting process. Similarly, sociologist Ruha Benjamin explains how “technology has the potential to hide, speed, and even deepen discrimination, while appearing neutral and even benevolent when compared to racist practices of a previous era”.
Her talk is available on video. If you prefer to read her remarks, there’s an introduction and a complete transcript here.
Here are three illustrative fragments:
“In fact, we should acknowledge that most people are forced to live inside someone else’s imagination and one of the things we have to come to grips with is how the nightmares that many people are forced to endure are the underside of elite fantasies about efficiency, profit, and social control. Racism among other axes of domination helps produce this fragmented imagination, misery for some, monopoly for others.”
“To paraphrase Claudia Rankine, the most dangerous place for black people is in white people’s imagination.”
“…technology inherits its creators’ biases”
Living in someone else’s imagination is a societal problem that impacts all of us, especially marginalized groups. Let’s strive to live in our own imagination.
In 2009, the biologist E.O. Wilson described what he saw as humanity’s real problem. I think it’s also a meeting problem:
“The real problem of humanity is the following: we have Paleolithic emotions, medieval institutions, and god-like technology.” —E. O. Wilson, debate at the Harvard Museum of Natural History, Cambridge, Mass., 9 September 2009
Wilson sees emotions, institutions, and technology as disjointed in time. Emotions have driven human beings for millions of years, our institutions are thousands of years old, and we can’t keep up with our advances in technology.
Emotions run us; our rationality comes in a distant second. All meeting design needs to recognize this reality.
Institutions
The things we do reflect our culture, and the organizations we’ve constructed embody it. Our largest and most powerful institutions — political and religious — are also the oldest, with roots thousands of years in the past. What we think of as modern business meetings and conferences are hundreds of years old. Changes in their forms and traditions have been principally influenced by technology (see below) rather than any deep changes in human psychology.
The traditional top-down formats of meetings and conferences reflect the top-down structure of the institutions that still largely dominate our world. Traditional institutional norms discourage the creation of meetings that give participants the freedom to steer and co-create learning and connection experiences that are better for everyone involved. All too often, top-down institutional culture leads inexorably to hierarchical meeting formats.
So there’s a disconnect between what’s best for meeting participants, due to their fundamental psychological makeup, and the dictates of their institutional bosses and the organizations that organize the events.
Technology
And finally, there’s E.O. Wilson’s “god-like technology”. Even though technology is continually being redefined as anything that was invented after you were born, it’s impossible to ignore how rapidly technology has evolved and changed our culture and our meeting experiences. I carry in my pocket a phone that has more computing power and far more utility to me than a machine that filled an entire office building when I was a student. And the COVID-19 pandemic has vividly illustrated how technology has allowed us, almost overnight, to redefine what we have thought of as meetings for hundreds of years to a largely—at least for now—online experience.
Consequently, vendors flood us with technological “solutions” to problems we often aren’t even aware we have. In some cases, these solutions are actually manufactured for a plausible yet illusory need. But even when there’s a genuine problem that the right technology can solve, our emotions can make it hard for us to see its value, and our institutions may be resistant to implementation.
The tension between emotions, institutions, and technology at meetings
Wilson’s definition of humanity’s problem resonates with me. As I’ve shared above, our emotions, institutions, and technology also frequently conflict when we are planning meetings. There isn’t a simple solution that perfectly responds to these elemental forces. In the meetings industry, the best solutions recognize the effects of these forces on our gatherings and use conscious design to take advantage of them.
That means designing meetings that incorporate active learning by creating emotional experiences together; working with institutional stakeholders to convince them of the value of emotion-driven, participant-driven, and participation-rich approaches; and using the right technology — often human process technology — to make our meetings the best they can be.
Yes, humanity’s problem is a meeting problem. But we have the tools to solve it. All we need to do is to use them.
There’s a better way to improve meetings than augmenting them with technology. As Finnish management consultant and polymath Esko Kilpi says:
“Human beings augmented by other human beings is more important than human beings augmented by technology” —Esko Kilpi, quoted by Harold Jarche
At face-to-face meetings, we can facilitate relevant connections and learning around participants’ shared just-in-time wants and needs. This is more effective than augmenting an individual’s learning via technology. We maximize learning when:
Participants first become aware, collectively and individually, of the room’s wants, needs, and available expertise and experience (i.e. “the smartest person in the room is the room” — David Weinberger, Too Big To Know);
We use meeting process that successfully matches participants’ needs and wants with the expertise and experience available; and
Time and space are available for the desired learning to take place.
And of course, this approach significantly improves the quantity and quality of relevant connections made by participants during an event.
So the smart choice is to invest in maximizing peer connection and learning. Do this via simple human process rather than elaborate event technology.
I’ve wasted time at many events trying to use apps to connect attendees in some useful way. Even when high-tech approaches use a simple web browser interface, getting 100% participation is difficult: every attendee must have a digital device at hand, with a charged battery and reliable internet access.
Well-facilitated human process has none of these problems. The value of having a facilitator who knows how to do this work far exceeds the cost (which may be zero once you have invested in training staff to fulfill this function).
Ultimately, modern events thrive in supportive, participatory environments. Attendees appreciate the ease of making the connections they want and getting the learning they need from the expertise and experience of their peers. Once they’ve experienced what’s possible, they rarely enjoy going back to the passive meetings that are still so common.
Yes, we can use technology to augment learning. But most of the high-tech event solutions marketed today are inferior to, and invariably more costly than, radically improving the learning and connection that happen between people at our meetings.
Companies are now marketing services for artificial intelligence matchmaking at events. However, unresolved issues could impede the adoption of this technology, especially by attendees.
Consider this marketing pitch for an artificial intelligence event matchmaking service:
“Using the [AI] platform…it’s easier for attendees to make sure they have the right meetings set up, and for exhibitors to have a higher return on investment in terms of connections with high-quality buyers.” —Tim Groot, CEO Grip, as quoted in What AI Means To Meetings: How Artificial Intelligence will boost ROI, Michael Shapiro, July 2017 Meetings & Conventions Magazine
A win-win for exhibitors and attendees?
Tim describes using artificial intelligence matchmaking at events as a win for both exhibitors and attendees.
I’m skeptical.
Let’s assume, for the moment, that the technology actually works. If so, I think suppliers will reap most of the touted benefits, quite possibly at the expense of attendees. Here’s why.
Successful matchmaking needs digital data about attendees; an AI platform cannot work without this information. Where will the data come from? Tim explains that his service builds a profile for each attendee, drawing on sources such as “LinkedIn, Google, and Facebook” while also “scouring the web for additional information”.
Using social media platform information, even if attendee approval is requested first, creates a slippery slope, as privacy issues in meeting apps remain largely undiscussed and little considered by attendees during the rush of registration. The end result is that the AI matchmaking platform gains a rich reservoir of data about attendees that, without strong verifiable safeguards, may be sold to third parties or even given to suppliers.
In addition, let’s assume that exhibitors get great information about whom to target. The result: “high-value” attendees will be bombarded with even more meeting requests while attendees who don’t fit the platform’s predictions will be neglected.
In my opinion, the best and most likely to succeed third-party services for meetings are those that provide win-win outcomes for everyone concerned. Unfortunately, it’s common (and often self-serving) to overlook a core question about meeting objectives —whom is your event for? — and end up with a “solution” that benefits one set of stakeholders over another.
How well will artificial intelligence matchmaking at events work for attendees?
Artificial intelligence is hot these days, so it’s inevitable that event companies talk about incorporating it into their products, if only because it’s a surefire way to get attention from the meetings industry.
I know something about AI: in the ’80s, I was a professor of computer science, and even then the theory of artificial neural networks — the heart of modern machine learning — was thirty years old. Practical implementations, however, had to wait for the vastly more potent hardware of today’s computers.
While the combination of powerful computing and well-established AI research is demonstrating incredible progress in areas such as real-time natural language processing and translation, I don’t see why sucking social media and registration data into a database and using AI to look for correlations is going to provide attendee matchmaking that is superior to what can be achieved using participant-driven and participation-rich meeting process combined with attendees’ real-time event experience. (Once again, exhibitors may see a benefit from customized target attendee lists, but I’m looking for a win-win here.)
From the attendee’s point of view
When attendees enter a meeting room there’s a wealth of information available to help make relevant connections. Friends introduce me to people I haven’t yet met. Eavesdropping on conversations opens up more possibilities. Body language and social groupings also provide important potential matchmaking information. An AI matchmaking database includes none of these resources. All of them have led me (and just about everyone who’s ever attended meetings) to professional connections that matter.
Coda
I’ll conclude with a story. The June 2017 PCMA Convene article Can Artificial Intelligence Make You a Better Networker? describes a techsytalk session by Howard Givner where he “gave particular emphasis to the importance of facilitated matchmaking at events.” I like to think that Howard discovered this when he attended the participant-driven and participation-rich EventCamp East Coast I designed and facilitated in 2010, about which he wrote:
“…it was one of the most innovative and eye-opening professional experiences I’ve had. Aside from coming back with lots of new tips and ideas, I easily established triple the number of new contacts, and formed stronger relationships with them, than at any other conference I’ve been to.”
As we install more machines that replace formerly high-level work, we’re seeing fewer decently paid jobs and a growing need for a universal basic income.
In the summer of 1970, I had a cool teenage vacation job. I wrote computer programs for a trucking company in downtown Los Angeles. After I finished coding a new report, my boss asked me to share it with a small department’s employees. I told the fifteen people there what I had done. And I saw their horror as we realized that my report replaced what they had been doing for a paycheck.
I felt terrible about the consequences of my work. I felt angry with my boss, who knew exactly what would happen; he had made me the unwitting messenger of bad news. I never found out what happened to those employees, but we’ve all heard countless stories like this.
We are building and bringing to market machines that perform what were formerly:
high-level executive functions (e.g., financial and legal advice); and
“job-safe” manual labor (industrial and service robots).
I believe we are at a tipping point where the unexamined assumption that there will somehow always be enough paid work for people is breaking down.
Our children have a much harder time landing a “good job” unless they have an ever-shrinking set of high-level, constantly shifting skills.
It’s time to face an unpleasant reality. The notion that there will be enough paid jobs to allow workers to make a decent living may no longer be sustainable.
Sasha Dichter writes about “modern, techno-optimistic” solutions to important problems, as characterized by the quick-fix phrase “I’ve just heard about a great new ______ that will solve the ______ problem!”
Worth reading. Especially the article’s conclusion:
“Indeed, everyone I know who is changing the world is in the long-haul business.” —Sasha Dichter, The Long Haul
Broadcast has been the dominant paradigm for sessions, conferences, and meetings for hundreds of years. Most of the time, one person presents and everyone else listens and watches. Why?
“Things are the way they are because they got that way.” —Quip attributed to Kenneth Boulding
I think there are two principal historical reasons: one shaped by technology, the other by culture.
How technology shapes our system of education
Perhaps you’re thinking: Technology? Isn’t technology a relatively recent development? How could technology have influenced how we learned hundreds of years ago?
To answer these questions, let’s take a journey back in time. It’ll take a while, but stay with me! I’ll shine some light on some rarely-examined foundations of our current educational paradigm.
Understandably, we tend to think of technology these days as material devices like cars, printers, and smartphones or, increasingly, as computer programs: software and apps. But such a definition of what is and isn’t “technology” is far too narrow.
What is “technology”?
“Technology is anything that was invented after you were born.” —Alan Kay, at a Hong Kong press conference in the late 1980s
An older reader will immediately recognize a typewriter, but a child might stare in puzzlement at a 1945 Smith-Corona Sterling. A device found on a table at a yard sale appears to be a piece of rusty sculpture until a Google search reveals it’s a ninety-year-old cherry stoner. By Alan Kay’s definition, anything made after you became aware is technology. Anything really old, we don’t even recognize as technology!
This worldview exists because human beings are incredibly good at adapting to new circumstances, an ability that greatly increases our chances of surviving a hostile and treacherous world. But there’s a downside. When we change our environment by making useful things, what was once new becomes part of our everyday existence and, in the process, largely invisible to our senses, focused as they are on the new and unexpected. As David Weinberger remarks: “Technology sinks below our consciousness like the eye blinks our brain filters out.”
A wider definition of technology
So let’s adopt a wider definition of technology and see where it takes us. I’ve been influenced here by Kevin Kelly, in his thought-provoking book What Technology Wants.
Technology is anything made to solve a problem. —Adrian’s definition, a paraphrase of Wikipedia’s definition of technology
This definition is useful because it opens our eyes to technology that we have been using for a very long time.
Science, writing, and language
For example, by this definition, science is technology! Science is just a way that we’ve invented to understand the patterns we notice in the world we live in.
Science is old. Writing is older; it allows us to communicate asynchronously with each other.
Writing is technology!
And oldest of all—we don’t know how old—language is technology. Every culture and tribe has its own language, invented to solve the problem of real-time communication between its members.
These technologies are so old that they are invisible to us. They are part of our culture, the human air we breathe. Language, writing, and science are tools outside our conventional, narrow-scope view of technology. We instantiate these tools using invented conventions: sounds, gestures, and symbols. These sounds, gestures, and symbols, however, are secondary features of these ancient technologies. Ultimately, language, writing, and science are primarily about human process.
Human process technology
Human process has become the most invisible technology. It is inexorably and continually built into every one of us by our culture, starting the moment we are born, before we can speak, write, or reason. Throughout our lives, our culture teaches us the signs, sounds, and movements that carry meaning. We are superbly equipped to learn to speak, write, and think before we have any self-awareness of what we are being taught.
“We seldom realize, for example, that our most private thoughts and emotions are not actually our own. For we think in terms of languages and images which we did not invent, but which were given to us by our society.” —Alan Watts, The Book: On the Taboo Against Knowing Who You Are
Our awareness of the processes we constantly use to learn and make sense of the world and to connect with others is minimal. It’s like breathing, largely automatic and unconscious. As a result, the old process technology that we adopted for practical purposes long before recorded history continues to shape our lives today.
Think for a moment about the impact of language on our species. Before language arose, we had no way to transfer what we learned during our all-too-brief lives to our tribe and following generations. “These plants are safe to eat.” “You can make a sharp spearhead from this rock.” “Snakes live in that cave.” Every individual had to painfully acquire such learning from scratch. Language allowed parents and tribe elders to pass on valuable knowledge orally, improving survival and quality of life.
Similarly, the later development of writing made it possible to share, physically transfer, and expand a permanent repository of human knowledge. The evolution of the process methodology of science enabled us to design experiments about our world, codify the patterns we discovered, and turn them into inventions that transform our lives.
The effect of technology on education
Now we’re ready to consider the effect of the historical development of language, writing, and science on education. For almost all of human history, language was our dominant mode of communication and our single most important educational tool. If you wanted to learn something, you had to travel physically to someone who knew what you needed to learn, and they would then tell it to you. Eventually, schools developed: establishments for improving the efficiency of oral communication of information by bringing many students together so they could learn simultaneously from one teacher.
Language reigned supreme for millennia, thus becoming an invisible technology. Only when writing became established was it finally possible to transmit information asynchronously. By that time, the model of the single teacher and multiple students was buried deep in our collective psyche, and, to a large extent, the book paradigm mirrored the language process, since most books were written by a single expert and absorbed by a much larger number of readers.
(The very word lecture beautifully illustrates the adoption of old models that took place during the development of writing. The word is derived from the Latin lectūra which means—to read! The first books were so rare that a group who wished to study a book’s content would have someone read the book out loud while the others copied down what they heard.)
Even science started as an individual enterprise. The early study of “natural philosophy” by Socrates, Aristotle, and others used an oral teacher-student model. Although science today is largely an intensely cooperative enterprise, we still see considerable leftovers of the older invisible technologies in its societal organization: prescribed progressions towards mastery of fields, formal paths to tenure, the format of academic meetings, etc.
The effects of invisible technologies
What are the effects of these powerful invisible technologies on our educational archetypes? Technologies like language, writing, and science are thousands of years old, so it becomes very difficult for people to consider learning models other than broadcast, even though other models may be far more appropriate these days.
The earliest organized religious schools are a few thousand years old. The oldest non-religious universities began nearly a thousand years ago. For centuries, oral learning was the predominant modality in what we would recognize as schools. It wasn’t until the invention of the printing press in the fifteenth century that a significant number of people were able to learn independently from books and newspapers, which are, of course, still a form of broadcast media.
Even though the invention of inexpensive mass-printing revolutionized society, the old broadcast teaching models were sunk so deeply and invisibly into our culture that they have persisted to this day. When you are taught by broadcast by teachers who were taught by broadcast it is not surprising that when you are asked to teach in turn, you employ the same methods. And this ancient cultural conditioning, which we are largely unaware of, is very difficult to break.
As adults, when we create a meeting we are thus naturally primed to choose a broadcast paradigm for the “learning” portions. As a society, we are mostly unaware of our conditioning by past centuries of broadcast learning. And even when it is brought to our attention, it is still very difficult for individuals to break away from the years of broadcast process to which they were subjected as children.
The process we’ve been using for so long inhibits our ability to consider alternatives. But the quantity of “knowledge” that we currently expect adults to possess also plays a role. This leads us to the second reason why broadcast methodology infuses meetings.
How culture shapes our system of education
For most of human history, learning was predominantly experiential. Life expectancy was low by modern standards and formal education nonexistent. Even after schools began to become important institutions, curricula were modest. In the Middle Ages, formal education of children was rare; in the fifteenth century, only a small percentage of European children learned to read and write, usually as a prerequisite for acceptance as a guild apprentice.
Up until around a hundred years ago, advanced education was only available to a tiny number of students. The expectations for those entering university were laughable by today’s standards. Isaac Newton, for example, received no formal mathematics teaching until he entered Trinity College, Cambridge in 1661. Students didn’t routinely learn algebra, even at university, until the eighteenth century. In the Victorian era, secondary school students mastered the “three R’s”—reading, writing, and ‘rithmetic—plus perhaps a few other topics like needlework (girls only), geography, and history.
The drivers of education
The need for jobs has driven education since the birth of apprenticeship programs in the Middle East four millennia ago. Apprenticeship remained the dominant model of education until the advent of the Industrial Revolution, when it no longer matched the growing need for workers with just enough capability to handle repetitive work, plus a smaller number with specialized new trainable skills like bookkeeping and shop work. A period of emphasis on career and technical education ensued. Once formal education became a social and legislative requirement for a majority of children, curriculum wars erupted between the conflicting goals of content and pedagogy. These wars have been with us in some form ever since.
Whatever you think about the relative merits of “traditionalist” and “progressive” approaches to education (see Tom Loveless’s The Curriculum Wars for a good overview), the key cultural reason why broadcast methods remain firmly embedded in our children’s education is the sheer quantity of knowledge that society—for whatever reasons—is determined to cram into young heads during formal education. As the brief history above illustrates, we now require young adults to absorb a staggering diversity and quantity of topics compared to our expectations of the past.
As a result, there is no way to teach this added knowledge experientially in the time available. It took centuries for some of our brightest minds to formulate the algebra that today we routinely teach to eleven-year-olds! While we have probably developed better paths and techniques for sharing this educational content, any increased efficiency in delivery has not kept pace with the massive increase in expected knowledge mastery.
Why meetings perpetuate broadcast education
It is this significant cultural imposition that requires us to use primarily broadcast methods to educate our young in school. The mistake we make is to assume that the broadcast learning we received as kids should continue into adulthood. This is why meetings continue to concentrate on broadcast learning modes. Every one of us is conditioned by an overwhelming exposure to broadcast teaching in our youth.
Receiving specialized adult learning from an expert made sense for most of human history, up until the industrial age. Now that information is moving into systems outside our brains, we have an urgent need for adult learning modalities that do not concentrate on packing information into our heads. Instead, we’ll find that most of what we need to learn to do our jobs today comes from working informally and creatively on novel problems whose solutions need just-in-time information from our peers.
We find it hard to stop conference lecturing because it’s the dominant learning modality during our formal education before adulthood. Being taught in school, however inefficiently, via lecture about the amazing things humans have created, discovered, and invented indoctrinates us to believe that lecturing is the normal way to learn. That’s why we continue to inflict lecturing on conference audiences. It’s what we’re used to. Sadly, we’re mostly unfamiliar with alternative and more effective learning modalities that are more and more important in today’s world.
Yes, meetings are a mess!
If you’d like to read more about the ideas shared here, and also learn about how to make meetings powerful places for learning, connection, engagement, community-building, and action, check out my book The Power of Participation.
At every event I’ve ever attended, tap water has been free while bottled water usually costs money. Which leads to my Wi-Fi manifesto. I propose that organizers supply Wi-Fi like water at events.
These days, event Wi-Fi is a utility. People need a Wi-Fi connection, even when they are physically together in the same space. I know that providing Wi-Fi costs money—but so does providing water.
I believe that event organizers should, at a minimum, provide base-level, rate-limited free Wi-Fi throughout the meeting spaces of the venue, plus an optional paid, higher-performance tier of service.
The free Wi-Fi would be rate-limited to somewhere in the region of 100–300 kB/sec per device, regardless of how many devices each attendee brings. The paid tier would provide higher bandwidth, appropriate to attendee needs.
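To put those per-device numbers in context, here’s a back-of-the-envelope sketch of the aggregate backhaul a free tier like this might need. The attendee count, devices-per-attendee figure, and concurrency factor are illustrative assumptions, not measurements from any real event:

```python
# Rough free-tier capacity estimate. All inputs below are assumptions
# for illustration only.

PER_DEVICE_CAP_KBPS = 300      # kB/sec: top of the proposed 100-300 kB/sec range
ATTENDEES = 200                # hypothetical event size
DEVICES_PER_ATTENDEE = 2       # assumption: phone plus laptop or tablet
CONCURRENCY = 0.25             # assumption: fraction of devices active at once

active_devices = ATTENDEES * DEVICES_PER_ATTENDEE * CONCURRENCY
aggregate_mbps = active_devices * PER_DEVICE_CAP_KBPS * 8 / 1000  # kB/s -> Mbit/s

print(f"~{active_devices:.0f} active devices -> ~{aggregate_mbps:.0f} Mbit/s of backhaul")
# Output: ~100 active devices -> ~240 Mbit/s of backhaul
```

Even under these deliberately generous assumptions, the required circuit is well within what venues can provision today.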
“…for a 250-room hotel, the cost is about $2.50-$4.50 per room, per month.”
This infographic breaks down the costs, which work out to 10-15 cents a day. That’s $20-30/day for an event with 200 attendees. (At this point you may be wondering why some hotels charge $14.95/day for internet access per device. This is called “making money hand over fist”.)
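For anyone who wants to check the scaling from the infographic’s per-room daily cost to a per-event figure, here’s the arithmetic in runnable form:

```python
# Scale the quoted 10-15 cents per room per day to a 200-attendee event.
low_cents_per_day, high_cents_per_day = 10, 15   # from the infographic
attendees = 200

low_total = low_cents_per_day * attendees / 100    # dollars per day
high_total = high_cents_per_day * attendees / 100

print(f"${low_total:.0f}-${high_total:.0f} per day for {attendees} attendees")
# Output: $20-$30 per day for 200 attendees
```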
None of this is hard anymore
Rate limiting internet bandwidth for individual users is simple, thanks to the Quality of Service (QoS) policies built into modern, inexpensive routers and access points. You don’t even need two sets of access points for different bandwidth tiers; a single access point can support multiple discrete Wi-Fi networks, each mapped to its own VLAN. Finally, ramping up bandwidth and reliability for high-demand events is now relatively straightforward, because most systems support bandwidth aggregation, allowing multiple internet service providers to seamlessly supply bandwidth from more than one circuit.
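As a minimal sketch of the underlying mechanism, and assuming a Linux-based router, a hypothetical guest-VLAN interface named eth0.20, root privileges, and the standard tc traffic-control tool, here’s what capping the free tier’s aggregate bandwidth could look like. This is not any particular vendor’s procedure; per-device caps are usually just a checkbox in the access point’s own QoS settings:

```python
# Sketch: cap a free-tier guest VLAN's aggregate bandwidth with Linux
# traffic control. Interface name and rate are illustrative assumptions.
import subprocess

IFACE = "eth0.20"   # hypothetical guest-Wi-Fi VLAN interface
RATE = "240mbit"    # illustrative aggregate cap (see the estimate above)

# Remove any existing root qdisc; ignore the error if none is present.
subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"],
               stderr=subprocess.DEVNULL)

# Attach a token-bucket filter that enforces the aggregate rate cap.
subprocess.run(["tc", "qdisc", "add", "dev", IFACE, "root", "tbf",
                "rate", RATE, "burst", "500k", "latency", "50ms"],
               check=True)
```

The same token-bucket idea, applied per client rather than per interface, is what inexpensive routers expose through their QoS screens, which is why none of this requires specialist networking staff anymore.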
Attendees don’t expect events to provide high bandwidth internet access for free (though they’ll love you if you do). But, like a tap to fill your water bottle, bandwidth that’s sufficient for basic tasks like checking email, interacting on social media and light web browsing should be available for free at every event.
Like Water for Chocolate Wi-Fi. That’s my manifesto.
Want to join me—or am I dreaming? What do you think?