In 2017, I purchased an Apple Watch. It has improved my life in many ways. In particular, it’s become an essential tool for supporting my desire to exercise daily. The watch’s Workout app tracks my exercise. All I need to do is tell it what kind of exercise I’m about to start, then leave the app running until the exercise is over.
To pick the right exercise, the watch shows a scrollable list. Here’s what I saw today when I tapped the app:
Right now I’m living at home, and the two workouts I do most often are my daily outdoor run and yoga. So it’s convenient that these options are the first two I see.
The Workout app learns my preferences, and adjusts its display to show me the most likely workouts first.
My environment changes
Almost every year, I vacation in Anguilla, typically for three weeks. My exercise program there is different. I don’t run (it’s too hot for me!) but I walk daily, followed by a pool swim.
After a few days, the Workout app unlearns my most common home-based exercises and relearns my new routine, replacing the top two items on the Workouts list with the Outdoor Walk and Pool Swim choices.
For the remainder of my vacation, these two options stay at the top of the list.
Alas, all good things come to an end. On returning home, the Workout app unlearns my Anguilla routine and relearns my home routine.
And if my exercise regime changes over time, due to circumstances or location, the Workout app will continue to use its learn-unlearn-relearn routine to display the most likely choices first.
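One way to picture this learn-unlearn-relearn behavior is a simple recency-weighted score. To be clear, Apple hasn’t published how the Workout app actually ranks choices; this is just a minimal sketch, assuming exponential decay of past selections, that reproduces the pattern I observed:

```python
from collections import defaultdict

class WorkoutRanker:
    """Rank workout choices by exponentially decayed usage frequency.

    Recent workouts dominate the ranking; workouts you stop doing
    fade ("unlearn") automatically as their scores decay toward zero.
    (A hypothetical model, not Apple's actual algorithm.)
    """

    def __init__(self, decay=0.8):
        self.decay = decay              # per-update retention factor (0 < decay < 1)
        self.scores = defaultdict(float)

    def record(self, workout):
        # Decay every existing score, then reinforce the chosen workout.
        for name in self.scores:
            self.scores[name] *= self.decay
        self.scores[workout] += 1.0

    def ranked(self):
        # Highest score first: the most likely next choice.
        return sorted(self.scores, key=self.scores.get, reverse=True)

# Simulate the home routine: daily outdoor run plus yoga.
ranker = WorkoutRanker()
for _ in range(10):
    ranker.record("Outdoor Run")
    ranker.record("Yoga")

# Then a vacation routine: daily walk followed by a pool swim.
# After a few days, the vacation workouts outrank the home ones.
for _ in range(5):
    ranker.record("Outdoor Walk")
    ranker.record("Pool Swim")
```

Because every update decays the old scores, the model relearns a new routine after a few repetitions yet never permanently forgets the old one, which climbs back to the top as soon as it resumes.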
I’m sure that Apple has incorporated other examples of unlearning into its products, but this is one I’ve noticed. Small thoughtful touches like this have helped Apple products and services become market leaders in a very competitive industry.
#2 Apple Mail
Apple doesn’t always get things right, unfortunately. Apple’s Mail program provides a classic example of what happens when unlearning is not an option.
Apple Mail allows you to file messages in folders, a useful way for me to organize the 94,000 emails I currently store. Trying to be helpful, the program learns where you tend to store specific kinds of messages, and after a while, right-clicking a message will pop up an option to move it to the “learned” preferred folder.
This is a generally helpful feature — except…
Once Apple Mail has “learned” where to file an email, it won’t unlearn that choice!
Furthermore, there’s no way to manually reset Apple Mail’s choice!
For example, let’s say you’ve been working with Marce, a client’s employee, for some time, so you’ve been moving Marce’s emails to a folder for that client. After a while Apple Mail helpfully offers to move emails from Marce to that client folder. So far, so good. Then Marce moves to a new company, and you continue to work with them. Now you’d like to file Marce’s emails in a separate folder for the new client. Unfortunately, no matter how many times you manually file Marce’s emails in the new client’s folder, Apple Mail will forever continue to suggest moving them to the former employer folder!
You will have to move email from Marce to the new employer’s folder manually every time, remembering each time not to choose the (wrong) default that Apple Mail continues to suggest.
This is a drag, and a product flaw.
It surprises me that the Watch software incorporates learn-unlearn-relearn into its memory-limited program space, but Apple Mail on the desktop, where program size is not an issue, only includes the learn piece.
I’ll conclude with a few observations about the wider value of unlearning in organizations.
Most organizations need to innovate constantly, due to changing circumstances. Innovation doesn’t just involve coming up with new ideas. Innovation also requires a willingness and ability to cannibalize or destroy existing products or services; i.e. to unlearn what used to work, and relearn what is now relevant.
Here’s how to trash your brand. If I could completely avoid flying American Airlines I would. Not because of the airline’s mediocre rankings in on-time arrivals, lost baggage, fees, and customer satisfaction. After all, there are some airlines that are even worse. (Spirit, I’m looking at you.)
No, it’s their infuriating habit of pitching credit cards to passengers on every flight. For example, while I was trying to sleep on the red-eye I took last week.
I find the two- to three-minute pitches really annoying. We are literally a captive audience, strapped into our seats with nowhere to escape.
It’s awful on @AmericanAir. The airline claims flight attendants are there primarily for our safety. Except when they’re obnoxiously hawking credit cards multiple times over the PA during the same flight.
Besides annoying the heck out of me, I’m at a loss to understand how this is a good business decision.
Is the revenue they receive when some hapless passenger signs up a significant boost to their bottom line?
Are flight attendants so eager to supplement their salaries (apparently, they get ~$50 for every new customer) that they beg the airline to add extra work to their flight duties?
And, most importantly, does American Airlines think that pitching their credit card on every flight to captive passengers improves their brand?
After all, this survey found that over 90% of airline passengers said they’d never apply for a credit card in flight. (And, of course, some of those who would apply already have one—yet still have to put up with the same spiel on every subsequent trip!)
A creative alternative
Even if American Airlines truly believes that hawking credit cards to a captive audience is a good thing, it doesn’t have to do so in a way that annoys almost everyone on the airplane. Edward Pizzarello notes that United Airlines also pitches cards on their flights, using a classic marketing technique that is far less intrusive and, I suspect, far more effective.
After over thirty years working with organizations, I think that it’s possible to change organizational culture. But it’s far from easy!
First, many organizations are in denial that there’s any kind of problem with their culture. Getting leadership to think otherwise is an uphill or hopeless battle.
Second, if an organization does get to the point where “we want to change our culture”, there’s rarely an explicit consensus on what we “need” or “might” change.
Third, culture is an emergent property of the interactions between people in the organization, not a linear consequence of deeply buried assumptions to challenge and “treat” in isolation. Prescriptive, formulaic approaches to culture change are therefore rarely, if ever, successful.
Finally, organizational culture self-perpetuates through a complex web of rules and relationships whose very interconnectedness resists change. Even if you have a clear idea of what you want to do, there are no uncoupled places to start.
“Culture is an emergent set of patterns that are formed from the interactions between people. These patterns cannot be reverse engineered. Once they exist you need to change the interactions between people if you want to change the patterns.” —Chris Corrigan, The myth of managed culture change
This is why process tools like those included in The Power of Participation are so important. Imposed, top down culture change regimes attempt to force people to do things differently. Chris describes this process as “cruel and violent”. Participation process tools allow people to safely explore interacting in new ways. Organizations can then transform through the resulting emergent changes that such tools facilitate and support.
Image attribution: Animated gif excerpt from “Lawyers in Love” by Jackson Browne
For hundreds of years, broadcast has been the dominant paradigm for sessions, conferences, and meetings. Most of the time, one person presents and everyone else listens and watches. Why?
“Things are the way they are because they got that way.” —Quip attributed to Kenneth Boulding
I think there are two principal historic reasons: one shaped by technology, the other by culture.
How technology shapes our system of education
Perhaps you’re thinking: Technology? Isn’t technology a relatively recent development? How could technology have influenced how we learnt hundreds of years ago?
To answer these questions, let’s take a journey back in time. It’ll take a while, but stay with me! I’ll shine a light on some rarely examined foundations of our current educational paradigm.
Understandably, we tend to think of technology these days as material devices like cars, printers, and smartphones or, increasingly, as computer programs: software and apps. But this definition of what is and isn’t “technology” is far too narrow.
What is “technology”?
“Technology is anything that was invented after you were born.” —Alan Kay, at a Hong Kong press conference in the late 1980s
An older reader will immediately recognize a typewriter, but a child might stare in puzzlement at a 1945 Smith-Corona Sterling. A device found on a table at a yard sale appears to be a piece of rusty sculpture until a Google search reveals it’s a ninety-year-old cherry stoner. By Alan Kay’s definition, anything invented after you became aware of the world is technology. Anything that’s really old we don’t even recognize as technology!
This worldview exists because human beings are incredibly good at adapting to new circumstances. Such an ability greatly increases our chances of surviving a hostile and treacherous world. But there’s a downside. When we start making changes to our environment by making useful things, what was once new becomes a part of our everyday existence. In the process, what was formerly new becomes largely invisible to our senses, focused as they are on the new and unexpected. As David Weinberger remarks: “Technology sinks below our consciousness like the eye blinks our brain filters out.”
A wider definition of technology
So let’s adopt a wider definition of technology and see where it takes us. I’ve been influenced here by Kevin Kelly, in his thought-provoking book What Technology Wants.
Technology is anything made to solve a problem. —Adrian’s definition, a paraphrase of Wikipedia’s definition of technology
This definition is useful because it opens our eyes to technology that we have been using for a very long time.
Science, writing, and language
For example, by this definition, science is technology! Science is just a way that we’ve invented to understand the patterns we notice in the world we live in.
Science is old. Writing is older; it allows us to communicate asynchronously with each other.
Writing is technology!
And oldest of all—we don’t really know how old—language is technology. Every culture and tribe has its own language, invented to solve the problem of real-time communication between its members.
These technologies are so old that they are invisible to us. They are part of our culture, the human air we breathe. Language, writing, and science are tools outside our conventional, narrow-scope view of technology. We instantiate these tools using invented conventions: sounds, gestures, and symbols. These sounds, gestures, and symbols, however, are secondary features of these ancient technologies. Ultimately, language, writing, and science are primarily about human process.
Human process technology
Human process has become the most invisible technology. It is inexorably and continually built into every one of us by our culture, starting the moment we are born, before we can speak, write, or reason. Our culture teaches us throughout our lives the signs, sounds, and movements that carry meaning. We are superbly equipped to learn to speak, write, and think before we have any self-awareness of what we are being taught.
“We seldom realize, for example that our most private thoughts and emotions are not actually our own. For we think in terms of languages and images which we did not invent, but which were given to us by our society.” —Alan Watts, The Book on the Taboo Against Knowing Who You Are
Our awareness of the processes we constantly use to learn and make sense of the world and to connect with others is minimal. It’s like breathing, largely automatic and unconscious. As a result, the old process technology that we adopted for practical purposes long before recorded history continues to shape our lives today.
Think for a moment about the impact of language on our species. Before language arose, we had no way to transfer what we learned during our all-too-brief lives to our tribe and following generations. “These plants are safe to eat.” “You can make a sharp spearhead from this rock.” “Snakes live in that cave.” Every individual had to painfully acquire such learning from scratch. Language allowed parents and tribe elders to pass on valuable knowledge orally, improving survival and quality of life.
Similarly, the later development of writing made it possible to share, physically transfer, and expand a permanent repository of human knowledge. And the evolution of the process methodology of science enabled us to design experiments about our world, codify the patterns we discovered, and turn them into inventions that transform our lives.
The effect of technology on education
Now we’re ready to consider the effect of the historical development of language, writing, and science on education. For almost all of human history, language was our dominant mode of communication and our single most important educational tool. If you wanted to learn something you had to travel physically to where someone knew what you needed to learn and they would then tell it to you. Eventually schools developed: establishments for improving the efficiency of oral communication of information by bringing many students together so they could learn simultaneously from one teacher.
Language reigned supreme for millennia, thus becoming an invisible technology. Only when writing became established was it finally possible to transmit information asynchronously. By that time, the model of the single teacher and multiple students was buried deep in our collective psyche, and, to a large extent, the book paradigm mirrored the language process, since most books were written by a single expert and absorbed by a much larger number of readers.
(The very word lecture beautifully illustrates the adoption of old models that took place during the development of writing. The word is derived from the Latin lectūra, which means—to read! The first books were so rare that a group who wished to study a book’s content would have someone read the book out loud while the others copied down what they heard.)
Even science started as an individual enterprise. The early study of “natural philosophy” by Socrates, Aristotle, and others used an oral teacher-students model. Although science today is largely an intensely cooperative enterprise, we still see considerable leftovers of the older invisible technologies in its societal organization: prescribed progressions towards mastery of fields, formal paths to tenure, the format of academic meetings, etc.
The effects of invisible technologies
What are the effects of these powerful invisible technologies on our educational archetypes? Technologies like language, writing, and science are thousands of years old, so it has become very difficult for people to consider learning models other than broadcast, even though other models may be far more appropriate these days.
The earliest organized religious schools are a few thousand years old. The oldest non-religious universities began nearly a thousand years ago. For centuries, oral learning was the predominant modality in what we would recognize as schools. It wasn’t until the invention of the printing press in the fifteenth century that a significant number of people were able to learn independently from books and newspapers, which are, of course, still a form of broadcast media.
Even though the invention of inexpensive mass-printing revolutionized society, the old broadcast teaching models were sunk so deeply and invisibly into our culture that they have persisted to this day. When you are taught by broadcast by teachers who were taught by broadcast it is not surprising that when you are asked to teach in turn, you employ the same methods. And this ancient cultural conditioning, which we are largely unaware of, is very difficult to break.
As adults, when we create a meeting we are thus naturally primed to choose a broadcast paradigm for the “learning” portions. As a society we are mostly unaware of our conditioning by past centuries of broadcast learning. And when it is brought to our attention, it is still very difficult for individuals to break away from the years of broadcast process to which they were subjected as children.
The process we’ve been using for so long inhibits our ability to consider alternatives. But the quantity of “knowledge” that we currently expect adults to possess also plays a role. And this leads us to the second reason why broadcast methodology infuses meetings.
How culture shapes our system of education
For most of human history, learning was predominantly experiential. Life expectancy was low by modern standards and formal education nonexistent. Even after schools began to become important institutions, curricula were modest. In the Middle Ages, formal education of children was rare; in the fifteenth century only a small percentage of European children learned to read and write, usually as a prerequisite for acceptance as a guild apprentice.
Up until around a hundred years ago, advanced education was only available to a tiny number of students. The expectations for those entering university were laughable by today’s standards. Isaac Newton, for example, received no formal mathematics teaching until he entered Trinity College, Cambridge in 1661. Students didn’t routinely learn algebra, even at university, until the eighteenth century. In the Victorian era, secondary school students mastered the “three R’s”—reading, writing and ‘rithmetic—plus perhaps a few other topics like needlework (girls only), geography and history.
The drivers of education
The need for jobs has driven education since the birth of apprenticeship programs in the Middle East four millennia ago. Apprenticeship remained the dominant model of education until the advent of the industrial revolution, when apprenticeship no longer matched the growing need for workers just capable enough to handle repetitive work, plus a smaller number with specialized new trainable skills like bookkeeping and shopwork. A period of emphasis on career and technical education ensued. Once formal education became a social and legislative requirement for a majority of children, curriculum wars erupted between the conflicting goals of content and pedagogy. These wars have been with us in some form ever since.
Whatever you think about the relative merits of “traditionalist” and “progressive” approaches to education (see Tom Loveless’s The Curriculum Wars for a good overview), the key cultural reason why broadcast methods remain firmly embedded in our children’s education is the sheer quantity of knowledge that society—for whatever reasons—is determined to cram into young heads during formal education. As the brief history above illustrates, we now require young adults to absorb a staggering diversity and quantity of topics compared to our expectations of the past.
As a result, there is no way to teach this added knowledge experientially in the time available. It took centuries for some of our brightest minds to formulate the algebra that today we routinely teach to eleven-year-olds! While we have probably developed better paths and techniques for sharing this educational content, any increased efficiency in delivery has not kept pace with the massive increase in expected knowledge mastery.
Why meetings perpetuate broadcast education
It is this significant cultural imposition that requires us to use primarily broadcast methods to educate our young in school. The mistake we make is to assume that the broadcast learning we received as kids should continue into adulthood. This is why meetings continue to concentrate on broadcast learning modes. Every one of us is conditioned by an overwhelming exposure to broadcast teaching in our youth.
Receiving specialized adult learning from an expert made sense for human history up until the industrial age. Now that information is moving into systems outside our brains, we have an urgent need to use adult learning modalities that do not concentrate on packing information into our heads. Instead, we’ll find that most of what we need to learn to do our jobs today is based on working informally and creatively on novel problems whose solutions need just-in-time information from our peers.
We find it hard to stop conference lecturing because it’s the dominant learning modality during our formal education before adulthood. Being taught in school, however inefficiently, via lecture about the amazing things humans have created, discovered, and invented indoctrinates us to believe that lecturing is the normal way to learn. That’s why we continue to inflict lecturing on conference audiences. It’s what we’re used to. Sadly, we’re mostly unfamiliar with alternative and more effective learning modalities that are more and more important in today’s world.
Yes, meetings are a mess!
If you’d like to read more about the ideas shared here, and also learn about how to make meetings powerful places for learning, connection, engagement, community-building, and action, check out my book The Power of Participation.
There are many models of how people behave in groups, and each of them is useful in certain contexts. In the context of organizing and running a conference, I tend to employ an organic model, in which group members are seen in terms of their uniqueness, rather than categorized by their roles. An organic point of view allows and encourages people to find ways to work together in a variety of complex situations, and leads toward problem-solving that benefits everyone.
For example, a conference steering committee I coordinated was offered the option of engaging a well-known, desired keynote speaker for a conference to be held in six months. Initially, his appearance fee was more than our budget could handle, but at the last minute he suggested appearing virtually, giving his presentation on a large video screen, at an affordable fee. We needed to quickly find out whether the conference site could support a virtual presentation.
If we had been using a linear approach to group organization, we would have already chosen the steering committee member responsible for technical issues and it would be her job to resolve this issue. If she were busy or sick, I’d have had to poll the other committee members for help and ask someone to take on additional work. In this case, our committee was comfortable with an organic approach, so I sent a request for help to all the steering committee members, most of whom had some technical expertise.
Because the committee culture was one of staying flexible in the face of unexpected circumstances, cooperatively working together to solve problems, and respecting each member’s unique constraints and contributions, I didn’t worry about treading on anyone’s toes by sending out a general request for help. The outcome: One of the committee members had some free time and immediately offered his expertise, while another, the speaker liaison, told us he thought the speaker would have the information we needed and would check with him.
How do you build this kind of culture for your conference organizing team? This brings us to the question of what leadership means in the context of organizing and running a conference. Every book on leadership has a different approach; here’s what fits for me.
Author and polymath Jerry Weinberg describes organic leadership as leading the process rather than people. “Leading people requires that they relinquish control over their lives. Leading the process is responsive to people, giving them choices and leaving them in control.” Jerry’s resulting definition of leadership is “the process of creating an environment in which people become empowered.” This is what I try to elicit when working with a conference organizing team.
I also find Dale Emery’s definition of leadership helpful. Dale describes leadership as “the art of influencing people to freely serve shared purposes.” Bear this definition in mind as you work with your conference organizing team. It ties your interactions with them to your shared goal of realizing a vision, in this case organizing and running of a conference.
Who on the team leads in this way? Unlike the traditional, role-based version of leadership, any member can help build an atmosphere that supports this kind of leadership. Once the seeds of this culture are established, I’ve found that it tends to become self-perpetuating. People like working together in this way. Experiencing a conference team coming together, with the members enjoying their interactions while creating a great event, is one of the most satisfying aspects of my work.
Although the impetus for an organic approach can come from any team member, the conference coordinator is the natural initiator of these flavors of leadership. She is responsible for keeping the conference planning on track and avoiding planning and execution snafus. She does this, not by ordering people around, but through a respectful flow of timely reminders, check-ins, questions, requests for assistance, and appropriate redirections.
Some people have little experience working organically. They may join your team with the expectation that their responsibilities will be determined by others, that a team leader will give them well-defined jobs to do. Often, given a relaxed and open environment where their ideas are encouraged, they will grow into a more active role as they become more confident in their ability to contribute creatively and flexibly to the needs of organizing and running the conference.
A helpful reminder for leaders of every kind
Jerry Weinberg suggests you assume that everyone you’re working with wants to feel useful and make a contribution. He quotes Stan Gross’s device for dealing with his feelings that people are not trying to contribute: “They’re all doing the best they can, under the circumstances. If I don’t think they are doing the best they can, then I don’t understand the circumstances.”
Such a mindset will help you focus on finding solutions to people problems that inevitably arise in any group working together on something they care about.
How do you see event planning leadership? Is your model different? What can you add to these ideas?
How do you facilitate change? In this occasional series, we explore various aspects of facilitating individual and group change.
The devolution of the British roundabout
I grew up in England, where roundabouts are more common than traffic lights as a way of routing traffic. (Fun fact: Hertfordshire, England, built the first roundabout in the world in the early 1900’s.) When I was a kid, roundabouts were walled constructions in the center of traffic circles. They looked like this:
In the 1960’s, the Brits realized that such elaborate constructions were overkill, and roundabouts became more like this:
As time went on, roundabout design was simplified further to this minimalist design:
These so-called mini-roundabouts were smaller than previous versions because they allowed the rear wheels of large vehicles to drive over the edge of the central circle when making tight turns.
While in London last year I saw the most recent evolution of the British roundabout. The physical barrier of the central island has completely disappeared, and the roundabout has just become a simple painted circle, with directional arrows painted on the road surface.
Four lessons we can learn about facilitating change from this brief history of roundabouts
When we are facilitating a desired change, we need to communicate clearly the change we want to make
Think about what would have happened if the initial replacement for the street intersections that human cultures have used for thousands of years was a plain white circle. People would not have understood how it was supposed to function. Without a physical barrier forcing a circular route, drivers would have been tempted to drive straight across it. The first roundabout design had to impose a fundamentally different way of navigating intersections; otherwise it wouldn’t have worked.
It’s easier to facilitate change in small increments than in large leaps
By the time of the plain white circle roundabout, the concept of driving around, rather than through, circular objects placed at the center of intersections was imprinted on the British drivers’ psyche. The final design is quite different from the elaborate early roundabouts, but it developed through a series of incremental design refinements.
Change is attractive if the new situation has advantages
Each change in the design of the British roundabout created advantages for the builders (less expensive), environment (less space wasted at intersections), and users (more space to negotiate the intersection). While the promise of an improved outcome does not guarantee that a change will occur, it certainly can’t hurt.
Different cultures can have very different approaches to change
If you’re not British, your experience of roundabouts will be different; you may not even know what a roundabout is! In the United States, roundabouts only started appearing in the 1990’s (rotaries and traffic circles employ different rules). Other European countries have their own roundabout designs and unique histories of introduction. Don’t assume that a change that has worked for one culture will be acceptable to another. (A corollary to this lesson is that exploring other cultures is a wonderful way to have your eyes opened to aspects of your culture that you take for granted.)
Are there other lessons we can learn from the British roundabout? What other design evolutions can you think of that teach lessons about facilitating change?