Event design may be more important than you think. I’m going to argue that event design changes society. And I’ve got legendary communications theorist Marshall McLuhan and computer scientist Alan Kay on my side!
User Experience and Interface Design
My inspiration is an interesting Hewlett Packard Enterprise article: 15 books that influenced top UX and UI influencers by Joe Stanganelli. (UX and UI are abbreviations for computer hardware/software User eXperience and User Interface design.) Here’s an excerpt:
“Many UX and UI specialists take a great deal of inspiration and learning from books that have little, if anything, to do with UX.
This is particularly true for storied computer scientist Alan Kay, one of the inventors of the modern graphical user interface. A self-described “voracious reader” since age 3, Kay provided me with a list of more than two dozen authors—let alone books—that had a profound and “shocking” effect on his thinking and work. The works of one particular author, however, gets special attention from Kay: media theorist H. Marshall McLuhan. Kay points to three McLuhan works from the 1960s:
Together, says Kay, these books identify a fundamental concept of UX and human factors: that humans evolutionarily adapt their ways of thinking to fit communication technologies. Thus, design changes society [emphasis added].”
Event design changes society
Our society is defined by our communication technologies. As these technologies evolve, society adapts to them and is changed by them. For example, the development and design of touch interfaces have revolutionized what we can do on modern phones. Providing inexpensive, ubiquitous communication and knowledge retrieval to most of the human race has significantly changed society.
Event design has a similar impact. Readers of this blog will know that I’m not equating event design with glitz or logistics. Rather, event design is about what happens during an event, which is supported by the designed processes that make participant-driven and participation-rich meetings fundamentally different from the old information broadcast model.
Meetings have become a crucial supplier of professional, and hence societal, development. Traditional events espouse the outdated philosophy that only a few people should talk while the rest listen. As these designs fade away, participatory meeting models take their place. New, superior designs foster attendee connection and participation around wanted and needed learning.
Such fundamental transformation of event process inevitably creates societal transformation. Event design changes society!
Broadcast has been the dominant paradigm for sessions, conferences, and meetings for hundreds of years. Most of the time, one person presents and everyone else listens and watches. Why?
“Things are the way they are because they got that way.” —Quip attributed to Kenneth Boulding
I think there are two principal historical reasons: one shaped by technology, the other by culture.
How technology shapes our system of education
Perhaps you’re thinking: Technology? Isn’t technology a relatively recent development? How could technology have influenced how we learned hundreds of years ago?
To answer these questions, let’s take a journey back in time. It’ll take a while, but stay with me! I’ll shine some light on the rarely examined foundations of our current educational paradigm.
Understandably, we tend to think of technology these days as material devices like cars, printers, and smartphones or, increasingly, as computer programs: software and apps. But such a definition of what is and isn’t “technology” is far too narrow.
What is “technology”?
“Technology is anything that was invented after you were born.” —Alan Kay, at a Hong Kong press conference in the late 1980s
An older reader will immediately recognize a typewriter, but a child might stare in puzzlement at a 1945 Smith-Corona Sterling. A device found on a table at a yard sale appears to be a piece of rusty sculpture until a Google search reveals it’s a ninety-year-old cherry stoner. By Alan Kay’s definition, anything made after you became aware is technology. Anything really old, we don’t even recognize as technology!
This worldview exists because human beings are incredibly good at adapting to new circumstances. Such an ability greatly increases our chances of surviving a hostile and treacherous world. But there’s a downside. When we start changing our environment by making useful things, what was once new becomes part of our everyday existence and, in the process, largely invisible to our senses, focused as they are on the new and unexpected. As David Weinberger remarks: “Technology sinks below our consciousness like the eye blinks our brain filters out.”
A wider definition of technology
So let’s adopt a wider definition of technology and see where it takes us. I’ve been influenced here by Kevin Kelly, in his thought-provoking book What Technology Wants.
Technology is anything made to solve a problem. —Adrian’s definition, a paraphrase of Wikipedia’s definition of technology
This definition is useful because it opens our eyes to technology that we have been using for a very long time.
Science, writing, and language
For example, by this definition, science is technology! Science is just a way that we’ve invented to understand the patterns we notice in the world we live in.
Science is old. Writing is older; it allows us to communicate asynchronously with each other.
Writing is technology!
And oldest of all—we don’t know how old—language is technology. Every culture and tribe has invented its own language to solve the problem of real-time communication between its members.
These technologies are so old that they are invisible to us. They are part of our culture, the human air we breathe. Language, writing, and science are tools outside our conventional, narrow-scope view of technology. We instantiate these tools using invented conventions: sounds, gestures, and symbols. These sounds, gestures, and symbols, however, are secondary features of these ancient technologies. Ultimately, language, writing, and science are primarily about human process.
Human process technology
Human process has become the most invisible technology. It is inexorably and continually built into every one of us by our culture, starting the moment we are born, before we can speak, write, or reason. Throughout our lives, our culture teaches us the signs, sounds, and movements that carry meaning. We are superbly equipped to learn to speak, write, and think before we have any self-awareness of what we are being taught.
“We seldom realize, for example, that our most private thoughts and emotions are not actually our own. For we think in terms of languages and images which we did not invent, but which were given to us by our society.” —Alan Watts, The Book: On the Taboo Against Knowing Who You Are
Our awareness of the processes we constantly use to learn and make sense of the world and to connect with others is minimal. It’s like breathing, largely automatic and unconscious. As a result, the old process technology that we adopted for practical purposes long before recorded history continues to shape our lives today.
Think for a moment about the impact of language on our species. Before language arose, we had no way to transfer what we learned during our all-too-brief lives to our tribe and following generations. “These plants are safe to eat.” “You can make a sharp spearhead from this rock.” “Snakes live in that cave.” Every individual had to painfully acquire such learning from scratch. Language allowed parents and tribe elders to pass on valuable knowledge orally, improving survival and quality of life.
Similarly, the later development of writing made it possible to share, physically transfer, and expand a permanent repository of human knowledge. The evolution of the process methodology of science enabled us to design experiments about our world, codify the patterns we discovered, and turn them into inventions that transform our lives.
The effect of technology on education
Now we’re ready to consider the effect of the historical development of language, writing, and science on education. For almost all of human history, language was our dominant mode of communication and our single most important educational tool. If you wanted to learn something, you had to travel physically to where someone knew what you needed to learn, and they would then tell it to you. Eventually, schools developed: establishments for improving the efficiency of oral communication of information by bringing many students together so they could learn simultaneously from one teacher.
Language reigned supreme for millennia, thus becoming an invisible technology. Only when writing became established was it finally possible to transmit information asynchronously. By that time, the model of the single teacher and multiple students was buried deep in our collective psyche, and, to a large extent, the book paradigm mirrored the language process, since most books were written by a single expert and absorbed by a much larger number of readers.
(The very word lecture beautifully illustrates the adoption of old models that took place during the development of writing. The word is derived from the Latin lectūra, which means—to read! The first books were so rare that a group who wished to study a book’s content would have someone read the book out loud while the others copied down what they heard.)
Even science started as an individual enterprise. The early study of “natural philosophy” by Socrates, Aristotle, and others used an oral teacher-student model. Although science today is an intensely cooperative enterprise, we still see considerable leftovers of the older invisible technologies in its societal organization: prescribed progressions towards mastery of fields, formal paths to tenure, the format of academic meetings, etc.
The effects of invisible technologies
What are the effects of these powerful invisible technologies on our educational archetypes? Technologies like language, writing, and science are thousands of years old, so it becomes very difficult for people to consider learning models other than broadcast, even though other models may be far more appropriate these days.
The earliest organized religious schools are a few thousand years old. The oldest non-religious universities began nearly a thousand years ago. For centuries, oral learning was the predominant modality in what we would recognize as schools. It wasn’t until the invention of the printing press in the fifteenth century that a significant number of people were able to learn independently from books and newspapers, which are, of course, still a form of broadcast media.
Even though the invention of inexpensive mass printing revolutionized society, the old broadcast teaching models were sunk so deeply and invisibly into our culture that they have persisted to this day. When you are taught by broadcast by teachers who were themselves taught by broadcast, it is not surprising that, when you are asked to teach in turn, you employ the same methods. And this ancient cultural conditioning, which we are largely unaware of, is very difficult to break.
As adults, when we create a meeting we are thus naturally primed to choose a broadcast paradigm for the “learning” portions. As a society, we are mostly unaware of our conditioning by past centuries of broadcast learning. And when it is brought to our attention, it is still very difficult for an individual to break away from the years of broadcast process to which they were subjected as a child.
The process we’ve been using for so long inhibits our ability to consider alternatives. But the quantity of “knowledge” that we currently expect adults to possess also plays a role. This leads us to the second reason why broadcast methodology infuses meetings.
How culture shapes our system of education
For most of human history, learning was predominantly experiential. Life expectancy was low by modern standards and formal education nonexistent. Even after schools began to become important institutions, curricula were modest. In the Middle Ages, formal education of children was rare; in the fifteenth century, only a small percentage of European children learned to read and write, usually as a prerequisite for acceptance as a guild apprentice.
Up until around a hundred years ago, advanced education was only available to a tiny number of students. The expectations for those entering university were laughable by today’s standards. Isaac Newton, for example, received no formal mathematics teaching until he entered Trinity College, Cambridge, in 1661. Students didn’t routinely learn algebra, even at university, until the eighteenth century. In the Victorian era, secondary school students mastered the “three R’s”—reading, writing, and ’rithmetic—plus perhaps a few other topics like needlework (girls only), geography, and history.
The drivers of education
The need for jobs has driven education since the birth of apprenticeship programs in the Middle East four millennia ago. Apprenticeship remained the dominant model of education until the advent of the Industrial Revolution, when it no longer matched the growing need for workers just capable enough of handling repetitive work, plus some with new trainable specialized skills like bookkeeping and shop work. A period of emphasis on career and technical education ensued. Once formal education became a social and legislative requirement for a majority of children, curriculum wars erupted between the conflicting goals of content and pedagogy. These wars have been with us in some form ever since.
Whatever you think about the relative merits of “traditionalist” and “progressive” approaches to education (see Tom Loveless’s The Curriculum Wars for a good overview), the key cultural reason why broadcast methods remain firmly embedded in our children’s education is the sheer quantity of knowledge that society—for whatever reasons—is determined to cram into young heads during formal education. As the brief history above illustrates, we now require young adults to absorb a staggering diversity and quantity of topics compared to our expectations of the past.
As a result, there is no way to teach this added knowledge experientially in the time available. It took centuries for some of our brightest minds to formulate the algebra that today we routinely teach to eleven-year-olds! While we have probably developed better paths and techniques for sharing this educational content, any increased efficiency in delivery has not kept pace with the massive increase in expected knowledge mastery.
Why meetings perpetuate broadcast education
It is this significant cultural imposition that requires us to use primarily broadcast methods to educate our young in school. The mistake we make is to assume that the broadcast learning we received as kids should continue into adulthood. This is why meetings continue to concentrate on broadcast learning modes. Every one of us is conditioned by an overwhelming exposure to broadcast teaching in our youth.
Receiving specialized adult learning from an expert made sense for most of human history, up until the industrial age. Now that information is moving into systems outside our brains, we have an urgent need to use adult learning modalities that do not concentrate on packing information into our heads. Instead, we’ll find that most of what we need to learn to do our jobs today is based on working informally and creatively on novel problems whose solutions need just-in-time information from our peers.
We find it hard to stop conference lecturing because it’s the dominant learning modality during our formal education before adulthood. Being taught in school, however inefficiently, via lecture about the amazing things humans have created, discovered, and invented indoctrinates us to believe that lecturing is the normal way to learn. That’s why we continue to inflict lecturing on conference audiences. It’s what we’re used to. Sadly, we’re mostly unfamiliar with alternative and more effective learning modalities that are more and more important in today’s world.
Yes, meetings are a mess!
If you’d like to read more about the ideas shared here, and also learn about how to make meetings powerful places for learning, connection, engagement, community-building, and action, check out my book The Power of Participation.
I feel irritated when I see so many event professionals focusing on “new” event technology while ignoring existing technology that, in many cases, could greatly improve their events at a fraction of the cost.
There, I said it.
Every year there are plenty of conferences where you can go and see the latest and greatest mobile and gamification apps, attendee tracking systems, registrant analytics, mobile networking, video streaming platforms, etc. Vendors are happy to sponsor these events. They use them to showcase their wares and, hopefully, convince attendees that their new technology is worth buying.
Let me be clear—I have nothing against new technology per se. (If I were, I’d be a hypocrite, given that I spent twenty-three profitable years as an information technology consultant.) What’s sad is that too much of event professionals’ limited continuing education time is spent investigating shiny new toys and apps while overlooking inexpensive and proven ways to provide effective learning, connection, engagement, and community building at their events.
Why does this happen? Here are two reasons:
We fixate on the new
“Technology is anything that was invented after you were born.”
—Alan Kay, from a Hong Kong press conference in the late 1980s
We are enveloped by so much rapidly changing technology that we fixate on what is new.
What was new quickly becomes taken for granted and largely invisible. As David Weinberger remarks: “Technology sinks below our consciousness like the eye blinks our brain filters out.”
Although technology in the form of human tools has existed for over three million years and we’ve had books for over half a millennium, the first history of technology wasn’t written until 1954. Flip charts, 5×8 cards, comfortable seating, room sets, healthy food and beverage, and hand voting have been around for a long time. They are old-fashioned technology to event professionals, so we don’t pay them much attention (unless they can be reframed in a sexy way, e.g. “brain food”). But that doesn’t mean they’re not important. Far from it.
Technology isn’t just manufactured goods and software
Our definition of what is and isn’t “technology” is far too narrow. We tend to think of technology in terms of products and embedded implementations (e.g. software). But this is an incredibly restrictive viewpoint. Kevin Kelly, in his thought-provoking book What Technology Wants, lists three of the most important human technologies:
Language: A technology that “shifted the burden of evolution in humans away from genetic inheritance…[allowing] our language and culture to carry our species’ aggregate learning as well.”
Writing: A technology that “changed the speed of learning in humans by easing the transmission of ideas across territories and across time.”
Science: “The invention that enables greater invention.”
Once we start thinking about technology with a wider lens like this, all kinds of possibilities arise.
Language, writing, and science are outside our conventional, narrow-scope view of technology. The conventional technology we use to instantiate the sounds, symbols, and so on that they employ is secondary. Language, writing, and science are primarily about human process.
When we expand our perspective on event technology to include process, many unexamined aspects of our events come into view. A few examples:
Why do we open conferences with a keynote?
Why do so few people speak during conference sessions?
How do we know if the sessions we’re providing are what attendees actually want?
Why do we provide entertainment during socials?
Are socials the best way to meet other attendees?
Why do we close conferences with a keynote or dinner?
When you start honestly investigating issues like these, instead of simply repeating things the same “safe” way you’ve previously experienced at conferences, you’ll discover all kinds of human process technology that can fundamentally improve your event in ways that a new gizmo or app cannot.
So I urge every event professional to re-envisage event technology to include the process used during your events. Concentrate less on improving logistical processes: registration, decor, A/V, F&B, and so on. These are secondary processes, and we know how to do them well. Instead, focus on improving the human process you use throughout the event venue and duration. How you