How to Protect Yourself from Unintended Consequences

Jake Wilder
Nov 27, 2018

Why the Silver Rule beats the Golden Rule

Photo by Yosh Ginsu on Unsplash

“When we try to pick out anything by itself, we find it hitched to everything else in the universe,” said Sierra Club co-founder John Muir, describing the difficulty of changing one aspect of a system without creating a cascade of unintended consequences in places we didn’t anticipate.

On Sunday, April 14, 1912, the RMS Titanic struck an iceberg, and the ensuing disaster claimed the lives of over 1,500 people. Afterwards, one of the primary reasons given for the disaster was an insufficient number of lifeboats. Although it carried 2,208 people, the Titanic was equipped with only 1,178 lifeboat seats.

It’s worth noting that the Titanic was legally required to carry only 962 lifeboat seats, and because many lifeboats left at less than full capacity, 472 of the 1,178 available seats went unused. Also worth noting that despite the captain’s orders that women and children board first, one Daniel Buckley disguised himself as a woman to get aboard and take one of the available seats. Also worth noting that only one man, Charles Joughin, survived the 31°F water…he reportedly had been drinking heavily.

Yet lifeboat availability was a clear concern, so in response to the Titanic disaster, the federal Seamen’s Act of 1915 required all ships to be equipped with enough lifeboat seats to account for every passenger.

Shortly thereafter, the SS Eastland, a passenger ship, capsized in the Chicago River, killing 844 passengers and crew members. The main cause? The additional lifeboats, added to comply with the Seamen’s Act, made the ship too top-heavy and unstable to support a full passenger load.

The Eastland disaster demonstrated a fact that administrators and regulators tend to forget, yet those who actually create and execute the work inherently know: universal rules are great on paper, but disastrous in practice.

Unintended or Unanticipated Consequences

“The law of unintended consequences pushes us ceaselessly through the years, permitting no pause for perspective.” — Richard Schickel

We’ve become too comfortable blaming failures and disasters on unintended consequences.

Few people ever intend to cause a major issue. I’ve never heard someone describe a disaster as an intended consequence of sound engineering and quality decision-making.

Unintended consequences have become a scapegoat for poor planning. They’re a catch-all for everyone who implements policy and regulations without the foresight to identify and account for future risk. And when we lean on this crutch, we’re more likely to blame external events and less likely to reflect on our decision-making process, setting our future selves up to repeat similar mistakes.

And yet few issues that we attribute to unintended consequences are ever truly unavoidable.

While India was under British rule, the government grew concerned with the number of venomous cobras in Delhi. So, in an effort to enlist the city’s residents in culling the population, it offered a bounty for every dead cobra.

Initially this cut down the numbers. But soon enterprising people realized they could breed cobras for the income. The government eventually realized what was happening and canceled the bounty. The cobra breeders, then faced with a stock of now-worthless snakes, elected to set them free in the city.

Failing to recognize how people would respond to incentives, the British government made the situation worse: the wild cobra population ultimately increased because of the intervention.

In another example, Australians introduced rabbits to their continent as a source of food, only to see them multiply out of control and become a major pest, wreaking havoc on the continent’s agriculture.

Committed to not learning from this mistake, Australians later introduced cane toads to control the beetles that were ruining their sugar cane crop. Unfortunately, cane beetles tend to live at the top of the sugar cane, and cane toads aren’t especially adept at climbing. Instead of controlling the cane beetle population, the toads multiplied exponentially and became a major pest in their own right.

A more lighthearted example is AOL’s implementation of a profanity filter in 1996. A good idea in theory, yet it also prevented residents of Scunthorpe, England from creating accounts.

These issues weren’t unforeseeable. They weren’t unavoidable. They’re just evidence of a lack of foresight in decision-making. And the less foresight, the worse the consequences. As Stephen Tobolowsky wrote in The Dangerous Animals Club,

“Any endeavor has unintended consequences. Any ill-conceived endeavor has more.”

Yet there are solutions. There are many ways to expand our circle of competence and reduce the biases that blind us to future risk. And there’s one simple check that often gives the best protection: ask, are we making things simpler or more complex?

Simple or Complex? Subtraction or Addition? Silver or Gold?

“It is vain to do with more what can be done with less.” — William of Occam (c. 1287–1347), originator of Occam’s Razor

Everyone, at one point or another, has had a well-meaning parent, the Gospel of Matthew, or a particularly memorable Berenstain Bears book lecture them about the golden rule — Do unto others as you would have them do unto you.

Yet the Golden Rule can be somewhat presumptuous, and in that presumption, risky. How are we supposed to know what’s good for everyone else? Most of us are lucky if we can recognize what’s in our own best interest, let alone what’s best for diverse groups of other people.

Yet most of our decisions, particularly those made in response to a crisis, ignore this blind spot. We layer on more precautions, rules, and procedures to tell people exactly what to do. All in the hope of directing them to success. And more often than not, preventing that very outcome.

The alternative would be to remember that while the Golden Rule helped Sister Bear empathize with Suzy MacGrizzie, it often falls short of setting robust policy.

Instead, the Silver Rule offers much less presumptive guidance — Do not do unto others as you would not have them do unto you.

A minor difference with far-reaching impacts. Instead of asking us to prescribe someone’s future behavior, it invites us to look for obstacles to remove.

This may seem like two sides of the same coin, but it’s often much easier to recognize which actions will have a negative impact than which ones will have a positive one. Mainly because we can much more easily see what is bad than what is good. As the French pilot and author of Le Petit Prince, Antoine de Saint-Exupéry, put it,

“Perfection is not when there is no more to add, but no more to take away.”

Unfortunately, most reactionary decisions fail to recognize this concept. When people are asked to develop solutions that prevent a recurrence, they default to adding more layers of protection. Because nothing says “it can’t happen again” like layers of regulations and rules dictating exact behaviors.

All of which makes perfect sense when you remember that those setting policy aren’t working towards actual long-term benefit, but the perception of long-term benefit right now.

So if we want to change this behavior, we need to change that incentive.

Give People Skin in the Game

“Things designed by people without skin in the game tend to grow in complication (before their final collapse).” — Nassim Taleb, Skin in the Game

Most of the people setting policy are not the same people who need to actually implement it. Just as the majority of those who write procedures and processes aren’t the same people who will need to execute them.

Given this separation, we need to ask whether we’re encouraging people to develop policies and processes that are simple or complex. Which alternative makes their role look more difficult, more complex, and more indispensable?

An engineer who develops a complicated design solution gives herself added job security. A manager who develops a complex recovery strategy over-inflates her value to the organization.

Additional layers are easy to justify and difficult to disprove. Especially following a crisis, people want to see the added benefit of more protection.

Yet regulations, once in, tend to hang around well past their expiration date. Few people are willing to cut rules and prescriptions for fear of triggering the next big issue. And as a result we find ourselves choked in bureaucratic red tape and unnecessary complications.

All of which adds greater risk of downstream unintended consequences.

The alternative is to stop separating those who make decisions from those who implement them. People who will be responsible for executing a policy are highly motivated to make it as simple as possible.

Push these decisions down to the level of those who will be responsible for their success. Or, if that’s not practicable, connect the decision-makers with those who will be directly impacted. Have them participate in qualification tests. Have them gather real-time feedback. Tie their rewards to long-term execution.

Incentivize simplicity and the ability to execute over that initial perception. Give people skin in the game. And while we’ll never fully avoid unintended consequences, we can at least cut back on the unanticipated ones. And stop using them as a crutch to avoid being held accountable for our decisions.

Thanks, as always, for reading. If you enjoyed this or have any suggestions, please let me know your thoughts. I’d love to hear from you. And if you found this helpful, I’d appreciate it if you could clap it up 👏 and help me share it with more people. Cheers!
