
Breaking New Ground - David Rostcheck


Brink’s, Incorporated

Published by the Office of Strategic Services

Breaking New Ground

Managing the Dynamics of Risk


“Take calculated risks. That is quite different from being rash.” - George Patton

Introduction - Adventure and Risk

Every team of adventurers that starts up a mountain expects to return.

History shows clearly that not every team does. It is in this gap between

expectation and reality that the dynamics of risk play out.

Risk is an inherent part of adventure

All important projects are adventures. In them we break new ground,

taking an organization to a position it has never occupied before. Risk comes

inherently commingled with adventure. In this context, we recognize risk

management as an intrinsic aspect of any successful project.

The term "risk management" seems staid and boring, the domain of

actuaries and accountants. But when we recognize risk as an essential

element of adventure, of breaking new ground, we see things differently. We

realize then that every record-breaking transcontinental flight, every space

launch, every voyage of exploration - indeed, any successful project at all -

succeeds or fails based on preparation and risk management. When

confronted with a rapidly evolving situation beyond the boundary of the plan,

the team’s response becomes critical. This is the point where the mission

either stays under control or slides over the edge.

This publication presents an overview of the manner in which project risks

manifest themselves. Exploring these topics helps to form the perspective we need to successfully navigate the inevitable unforeseen

challenges that arise when we engage the unknown.


“No battle plan ever survives contact with the enemy.” - Helmuth von Moltke

A Brief Study of Catastrophe

The project team and the business leaders were equally excited. The new

technology had been demonstrated to work successfully in a production

environment. The technology worked, the business processes worked, the

users loved the product, and the pilot site showed significant cost savings. All

the hard problems had been solved; all that remained was to deploy the

technology throughout the organization. What could go wrong?

A year later, the project lay in disastrous ruin. Its leaders were humiliated,

their careers drawing to untimely ends. A frayed business team faced off

against a burned-out technical team, strafing each other across the

landscape of a failing rollout. The project never made it past the third

deployment site and would eventually be considered a total failure. The

deployed sites became a support quagmire from which it would take the organization years to withdraw.

The scenario described above is drawn directly from a real project within

Brink’s named ADGP. Its trajectory, while dramatic, is not uncommon.

What happened? How could a project on which all the hard problems had

been solved disintegrate so badly? To answer these questions, we need to

explore the nature and character of risk – to separate it from its context and

look at the topic of risk itself, and how we perceive it – or do not.


“A whole stream of events issues from the decision, raising in one's favor all manner of unforeseen incidents, meetings and material assistance,

which no man could have dreamt would have come his way.”- William Hutchinson Murray

The Unknown Unknown

It is in new territory - in the space beyond the comfortable, beyond the

known - that we evolve our capability. But by definition, any voyage that

expands our capacity must take us into new territory, and any unknown

territory involves risks that we do not know. It is our mission as adventurers

to explore, engage, and conquer these risks, and to return the victorious

masters of our new territory. As our first step in doing so, we must remind

ourselves constantly that in this terra nova,

We do not know what we do not know

Or, to put it differently,

We do not understand this problem

We often bristle at the second sentence. Certainly we understand what we

are doing, do we not? Are we not skilled professionals?

Our emotional response blurs the basic underlying truth - we are in

strange waters and, skilled as we may be, we do not know this place. If we

understood the problem, if we knew the territory, then it would not be frontier

and we would not be out here seeking new-found reward. No, the terrain

around us is new, and even when it looks familiar - especially when it looks

familiar - we must expect to be surprised.

It is easy to agree intellectually with the above reflection. But attaining a

mindset in which we actually do expect the unexpected turns out to be very difficult. Blind spots, such as believing we fully understand a

problem that we do not, are built into our very cognitive structure. The


techniques of risk management seek to catalog these inherent blind spots and

learn to compensate for them.


“Unfortunately, there’s no such thing as a beginner’s mountain.”-Laurence Gonzales

Learning to Fear the Familiar

The assumption that we understand the terrain is the shoal upon which

projects founder with astonishing regularity. Seen from a distance, the reasons

appear obvious. Territory that looks familiar is not really familiar, and when it

suddenly acts in an unfamiliar way, our ready reaction is the wrong one. As

we climb the mountain, the ice below us suddenly acts differently than it did a

mere hundred feet below. Although we may have climbed dozens of

mountains, each one is different. Our general knowledge is helpful, but we

need to remind ourselves continually:

We do not understand this problem

It is a new problem, and although it may be quite similar to others that we

have faced in the past, it will be different, often in ways we may not recognize

at first.

It is in this very familiarity that the seeds of project disintegration are

often sown. Studies of casualties in wilderness adventure show a surprising

result - most fatalities occur where the situation appears

familiar and unthreatening. Adventurers survive their challenging white-water


run only to later drown bathing in water that looks placid, but has a deceptive

undercurrent.

Bison injure more visitors to Yellowstone National Park than any other animal. Hikers retreat from the feared grizzly bear and other predators, but

do not fear herd animals. Yet while they are not predators, bison are actually

quite dangerous. Our experience with domestic herd animals betrays

us. Breaking new ground involves a constant battle with ourselves to

remember this rule:

Fear the familiar

It is not really familiar. When we break new ground, we are in unknown

territory; situations may look familiar but we must remember that they are

really not.

The Two Data Point Fallacy

One day a guest lecturer in a university mathematics department gave a

talk about a subject called “Design of Experiments”, a way of selecting sample

points for an experiment so as to get the most information from it. The

lecturer described sampling two different points and then asked how much

information could be known from that. Attendees proposed various theories.

Finally a young undergraduate who was working in a lab for the summer

raised his hand.

“None,” he answered.

“Why not?” asked the lecturer.


“Because you’ve only sampled each point once,” the undergraduate

replied. “You don’t know that your equipment is even measuring the same

data consistently until you go back and resample them later.”

The lecturer smiled. “Finally, an experimentalist,” he said.
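A small numerical sketch may make the undergraduate's point concrete. The noisy measuring "instrument" below is invented purely for illustration; only the lesson about resampling comes from the anecdote above.

    # A minimal sketch, assuming an instrument with unknown measurement noise.
    # The settings (10.0 and 10.5) and the noise level are hypothetical.
    import random
    random.seed(1)

    def measure(true_value, noise=1.0):
        """One reading from an instrument whose noise we do not yet know."""
        return true_value + random.gauss(0, noise)

    # One sample at each of two settings: the difference we observe could be
    # a real effect or could be pure instrument noise - we cannot tell which.
    a_once, b_once = measure(10.0), measure(10.5)
    print("single readings:", round(a_once, 2), round(b_once, 2))

    # Resampling one setting shows how much the instrument wanders on its
    # own, which is what lets us judge whether the two settings truly differ.
    a_repeat = [measure(10.0) for _ in range(10)]
    print("spread of repeated readings at one setting:",
          round(max(a_repeat) - min(a_repeat), 2))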

Our strong urge to build a model immediately once we have some data

often masks questions about the real informational content of the data. This

urge is hard to resist, but can lead a project astray, especially when good

results at early test sites are not indicative of later sites. Proof-of-concept

sites are often powerfully motivated to praise a project; moreover, projects

are generally tailored to the requirements of the first customers who use the

product. Your first customers may not end up being your representative

customers.

In the case of the catastrophic project discussed earlier, the fact that the

pilot site had gone so well led the team to conclude that the system met the

business needs well. In actuality it did meet the business needs of the pilot

site well, but fit less well with the next sites, whose business processes were

different. Conversely, bad results in early pilot sites could condemn a solution

that might actually work well for the majority of sites. Additionally, the project team noted that their early success encouraged changes, because the team felt they had a full understanding of the issues

when in retrospect they did not. These changes introduced unforeseen issues

and contributed to the ultimate failure of the project.


The Stop Light Problem

When a stop sign is replaced with a stop light, many drivers will for

several weeks stop at the light even when it is green. They do this

automatically, each suddenly noticing that he or she is inappropriately

stopped in moving traffic. Why does this happen? The answer gives us a clue

to the difficulties of assumed familiarity and the mechanics behind them.

The real world teems with a million different changing variables. The

fluidity and complexity of a basketball game, for example, present an inconceivable mathematical challenge. Athletes have no time for analysis

paralysis; they must take a shot or pass the ball in a

split second.

We can perform these complicated tasks because we are a race of model builders. When we face a problem, our cognitive structures, evolved through millions of years, are designed to rapidly construct a simplified model of the world. Our model discards all the unimportant inputs and focuses only on the essence of the problem. We literally discard information that does not fit our model, so it does not interfere with our important task.

In a classic experiment illustrating our task-based myopia, volunteers

focused on counting the passes in a basketball game. Halfway through the

game, a man in a gorilla suit walked slowly and incongruously across the

court. Half the volunteers, when questioned later, did not recall seeing the gorilla


at all. Their eyes saw it, but because it was not relevant to counting passes,

their operant model discarded the information.

In unfamiliar terrain that seems familiar, our models betray us by filtering

out all the “unimportant” details. When catastrophe strikes, later analysis

often shows that it was clearly signaled by “unimportant” details that the decision

makers filtered out until they became overwhelming.

Marrying the Model

Making models requires effort, and our brain is loath to discard a

functioning model because of a few inconsistencies. Instead we deny the

inconsistencies or downplay their importance, becoming progressively more

invested in our model. In financial firms, traders call this tendency “marrying

the model.”

This tendency is lethal to our successful operation when conditions are

changing. Especially when exploring unknown territory, we must see what is

really happening and react to it. Operating in a dynamic environment requires

a process based on convergence rather than one based on perfect advance

planning. When new requirements emerge or business conditions change, we

must modify our plans in response – and the sooner the better, because small

corrections upstream can easily solve problems that become formidable to

solve downstream. Realizing that we intrinsically and immediately make

models of our environment, we must resist the tendency to marry our model

and instead cultivate a willingness to discard it.

The LEAN process improvement methodology and the related Agile

software methodology represent systematic attempts to preserve our mental


and operational flexibility. These methods produce a greater probability of

success by strongly resisting the urge to believe we know more than we

know.

The False Knowledge Trap

Business leaders need projections for advance planning, so they routinely

ask teams for time and cost estimates. But when launching exploratory

projects into new territory, we literally do not know enough to make any

worthwhile estimate at all. Our best estimates are based on the situation as

we currently understand it, and we know that we do not really understand it.

If we do not know what we do not know, we certainly cannot put valid

numbers to those unknowns.

Providing straightforward estimates for a highly uncertain situation creates

a false sense of knowledge. A number, once stated, carries the implicit

message that we understand the situation. The sense of uncertainty and risk

becomes lost. When we do not understand the terrain, we must be especially

vigilant not to quantify that false knowledge in dubious numbers and dates for

our leaders. Uncertain numbers end up conveying false certainty, false

certainty becomes enshrined in models, and models, once established, are

difficult to abandon.
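One way to resist handing leaders a single misleading number is to report a range of outcomes rather than a point estimate. The sketch below illustrates that idea; it is not a method prescribed by this publication, and the task list and duration guesses in it are entirely hypothetical.

    # A minimal sketch: simulate many possible schedules from optimistic /
    # most-likely / pessimistic guesses and report a range, not one number.
    import random
    random.seed(7)

    # Hypothetical tasks, each with (optimistic, most likely, pessimistic)
    # duration guesses in weeks.
    tasks = [(2, 4, 10), (3, 5, 12), (1, 2, 6)]

    def one_scenario():
        return sum(random.triangular(low, high, mode)
                   for low, mode, high in tasks)

    runs = sorted(one_scenario() for _ in range(10_000))
    p10, p50, p90 = runs[1_000], runs[5_000], runs[9_000]
    print(f"Likely range: {p10:.0f} to {p90:.0f} weeks (median {p50:.0f})")

Stating the estimate as a range keeps the uncertainty visible instead of burying it inside a single figure.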


A Monte Carlo simulation that randomly guesses numbers from 1 to 10,000 and keeps the lowest. Knowledge grows as more numbers are tried, but each individual number has no connection whatsoever with the next or previous numbers.

Finding Patterns in the Noise

The figure described above shows a Monte Carlo generator, a computer

program that generates random numbers. This one is running a test

simulation that simply picks random numbers from 1 to 10,000, scores its

results by how far away from 0 they are, and displays the 10 best (those

closest to 0). Every time it chooses a permutation – that is, a random number

– it discards it from the pool and will never choose it again. The permutation

space gauge shows the percent of the permutation space explored – the

number of unique numbers we have chosen so far from the set 1 to 10,000.
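For readers who prefer code to prose, a minimal Python sketch of the generator just described follows. The 1-to-10,000 range, the distance-from-zero scoring, and the permutation space gauge all come from the description above; the function and variable names are ours.

    # A minimal sketch of the Monte Carlo generator described above.
    import random

    POOL_SIZE = 10_000

    def run_monte_carlo(draws):
        """Draw `draws` unique numbers from 1..10,000 (no repeats), keep the
        10 closest to 0, and report how much of the space has been explored."""
        pool = list(range(1, POOL_SIZE + 1))
        random.shuffle(pool)               # drawing without replacement
        sampled = pool[:draws]
        best_ten = sorted(sampled)[:10]    # "best" = closest to 0
        explored = draws / POOL_SIZE       # the permutation space gauge
        return best_ten, explored

    best, explored = run_monte_carlo(draws=500)
    print("10 best results:", best)
    print(f"Permutation space explored: {explored:.1%}")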

If the simulation generates a few relatively low numbers, most people

will assume that the system is converging on better solutions – that is, that

the next number is likely to be “better” (in this case, lower) than the last.

The Monte Carlo solution does converge, but the numbers it produces as results tell you nothing about its convergence at all. If you draw the number 1, your next draw might be 981, or 23, or 2; all are equally likely.

The only number that gives information is the percent of the permutation

space you have explored.

This simulation shows that people have a poor feel for randomness. Our

minds, quick to form models, will find patterns where there are none.


When launching projects across an enterprise, this tendency to believe we

have identified a pattern where there really is no pattern can lead to serious

project failure. If you pilot a system in 5 test sites out of 100 total, you may

still know very little about the next 5 sites – they can be completely different

from the first 5. If they turn out to be exactly like the first 5, you still know

very little; you have explored only 10% of your permutation space. What if 80

sites are similar but the other 20 are all radically different from each other,

and you have simply not found one of the 20 yet?
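The arithmetic behind that worry is easy to check. In the hypothetical 80/20 split above, a standard probability calculation (not part of the original project) shows how often a 5-site pilot would see only "typical" sites.

    # If 20 of 100 sites are radically different, how often does a pilot of
    # 5 randomly chosen sites miss every one of them?
    from math import comb

    total_sites, atypical, pilot_size = 100, 20, 5
    typical = total_sites - atypical

    # Probability that all 5 pilot sites come from the 80 "typical" sites.
    p_miss = comb(typical, pilot_size) / comb(total_sites, pilot_size)
    print(f"Chance the pilot sees only typical sites: {p_miss:.0%}")

Roughly one pilot in three would never encounter an atypical site at all, and even the pilots that do would see only a few of the twenty.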

Exactly this problem trapped the team in the catastrophic project

discussed earlier. They assumed that their successful pilot had given them

information about how well the system would work at other sites. While this

assumption seemed reasonable, it turned out that the operating processes in

some sites were very different than those in others. Huge risks lurked in the

unexplored areas of the map.

Knowledge grows as we explore our permutation space. When we have

deployed successfully at all the possible sites, we know everything. When we

have deployed to half of them, we have reduced our unknowns by half. We still

do not know what lurks in the remaining half of the sites, but at least we do

know about the half we have completed. When we have sampled only a few,

we know very little. For this reason, projects that send a basic prototype to

many sites learn much more than projects that send an elaborate prototype

to a few select sites. The successor system to the ill-fated ADGP project

deployed a very limited release as widely as possible. With the broad set of experience gained early on, it succeeded where its predecessor system had failed.


“Strange things happen near the boundaries”- James Gleick

Off the Manifold

The human tendency to read more into the data than it actually contains is best illustrated graphically. The accompanying image shows a

complicated surface (a “manifold” to a mathematician). Sampling a few

points, indicated in green, we determine that those points have similar

values. If, for example, we speak to three customers and get similar opinions,

clearly we now understand our customer.

Or do we? The graphic illustrates that the real situation is much more

complicated, but we have simply missed the complicated parts. We might find, for example, that a healthy percentage of our customers have a totally different requirement. Only then do we discover that the solution that seemed to meet our customers' needs quite well is suddenly “off the manifold”.
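A small numerical sketch captures the same trap. The surface below is invented for illustration; only the sampling lesson comes from the discussion above.

    # A surface that looks flat everywhere we happen to sample it,
    # but is not flat everywhere.
    import numpy as np

    def surface(x, y):
        """Mostly near 1.0, with a sharp peak hidden near (0.7, 0.7)."""
        return 1.0 + 4.0 * np.exp(-((x - 0.7) ** 2 + (y - 0.7) ** 2) / 0.005)

    # Three "customer interviews" that happen to land in the flat region:
    samples = [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4)]
    print([round(float(surface(x, y)), 2) for x, y in samples])  # all about 1.0

    # A dense sweep reveals how different the surface really is.
    xs, ys = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
    values = surface(xs, ys)
    print("min:", round(float(values.min()), 2),
          "max:", round(float(values.max()), 2))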


“My model of survival is that stress is what occupies the gap between your realities and your expectations”

- Deborah Scaling Kiley

Conclusion: Catastrophe Revisited

If we now return to the failed business project discussed earlier, its failure is no longer so perplexing. A little success can be a

dangerous thing; it leads us, flushed with excitement, to forget that we do

not know what we do not know.

Conversely, studying how and where projects unravel gives us a rich set of

tools with which to avoid similar derailments. Our cognitive engine, quick to

build models, will try to read too much into the data. We will tend to marry

our models, to give numbers to leaders that convey false certainty, and to

ignore small discontinuities until they become hard to deal with. Knowing

these tendencies, we can arm ourselves with techniques and methodologies

specifically designed to address these issues.

In the end, the project that seemed a total catastrophe yielded invaluable

lessons. Its failure sowed the seeds of many future successful ventures into new

territory.

Any venture beyond the known involves risk. To operate successfully in

uncharted waters, we must acknowledge that we do not know the territory –

even though it looks familiar. We must realize that it harbors risk, and risk is

a normal part of exploration. Using techniques to identify, catalog, and

manage risk allows us to explore new territory, develop new competencies,

and ultimately to succeed in our commercial objectives.


Production and Concept Notes

Breaking New Ground was produced by the Office of Strategic Services. It

draws from the shared experience of the LEAN & Agile communities at Brink’s,

particularly the CIT Business Team.

About the Office of Strategic Services

The Office of Strategic Services (OSS) is a strategic consulting and services organization that seeks out opportunities to create, align, and advise projects within our global family of companies. The

OSS collaborates with internal Brink’s clients to help them become high-

performance departments, branches, and product groups by achieving

innovative solutions to persistent organizational problems. For more

information, see http://pulse/oss.
