Internal Controls Design website by Matthew Leitch (tutor, researcher, author, & consultant)


Progressive risk control integrated with strategy and performance management
keeping risk management and internal control interesting and useful

by Matthew Leitch, 21 November 2008

Abstract: This article explores the idea of keeping risk control (i.e. risk management and internal control) interesting and useful to people by repeatedly improving the methods used by each team, building on their previous progress rather than just updating it. The idea is to keep up a flow of fresh insights. Along the way, risk control is closely integrated with other management thinking. The technical basis for the improvements is to develop mental models as the basis for management thinking, including risk analysis. The polls in the original, interactive, version of this article provided information about what most people think, and the results are reported below.

Key words: risk management, internal control, performance management, strategy, Balanced Scorecard, Kaplan & Norton, Constructive Simplicity, Chapman & Ward.


Some common ground

Before getting into the technicalities of mental models and the derivation of risk analyses, let's establish some common ground about what would make sense for management thinking processes, if we knew how to do it.

Most people already believe the main principles promoted by the article, even though they sometimes conflict with common practices, so I hope the technical suggestions included in the article are useful.

For each of the questions readers were asked to click once on the button in the box holding the answer they preferred, after which the next part of the article appeared. They were shown the same text whatever answers they gave. Most people who started continued to the end, even if they didn't agree entirely with the views I express.

Poll 1 of 15: Assuming we have time to work on it if necessary, how should risk/uncertainty be managed?

- Separately from other management thinking, i.e. separate meetings, documents, and perhaps a specialist support manager.
- Integrated into other management thinking, e.g. about performance, strategy, planning, and resource allocation.

Information on how other people answered these questions comes from the first 100 online respondents to answer at least one question, and from a show of hands at a professional conference for IT specialists with an interest in risk management. On this first question the overwhelming majority of the live audience and 87% (87 of 100) of online respondents preferred integrating consideration of risk/uncertainty with other management thinking rather than keeping it separate.

Integrating risk thinking into other management thinking might still benefit from the help of a specialist support manager, and it may help to pick out areas for particular attention. However, most people would like to do something other than hold meetings to talk solely about risk, keep separate lists of 'risks', and produce reports only concerned with 'risk'.

This is interesting because so much current regulation and guidance on risk control, despite mentioning 'embedding', leads to separate documents, meetings, and specialist support managers.

COSO's famous integrated framework for internal control tried to encourage a form of integration by saying that analysis of risk should start with objectives. What is the organisation trying to achieve? What are the risks affecting its success? Unfortunately, this style of analysis has tended to lead to inward looking analyses that end up re-stating, in risk terms, plans that do not deal with risk. Also, in practice it is still something that people with an interest in risk control do after the big decisions have been made by others.

In general, results seem to be poor if top people make strategies without worrying about risk and then expect lower level managers to somehow deal with risk in such a way that achieving the set strategy is assured.

These problems can be solved and it helps to look at things from the point of view of the people making those big decisions.

Poll 2 of 15: In thinking about performance, strategy, plans, business models, resource allocation, etc, what should we do with uncertainties in our thinking?

- Explicitly note them, think about them, and discuss and respond to them where we can.
- Just go with our best guess at any time and ignore uncertainties.

Most people in my live audience and 93% (83 of 89) of online respondents thought that in these situations uncertainties should be noted and dealt with, not ignored. This is consistent with the view that risk/uncertainty should be dealt with in an integrated way.

Now is a good time to clarify what I mean by 'strategy and performance thinking'. This phrase is being used to mean thinking about what to do, and why, and how, regardless of where in an organisation it is carried out. For the board of directors of a large company only really big stuff is strategic. For the team in the post room, trolley selection could be a huge decision. Each team is doing something similar, but at a different organisational level. They are thinking about what would be an improvement, what the current situation really is and how the world they care about works, what actions might be taken and what effect they might have, how to use their resources, and what they should do next. Surrounding all of this thinking are numerous uncertainties, otherwise known as 'risks'.

The majority view, that these uncertainties should be dealt with rather than ignored, is interesting because so much of strategy and performance thinking works on a 'best guess' basis, with a tendency to try to achieve consensus on things that are uncertain rather than just agree where points are uncertain and deal with the uncertainties properly.

Awareness of uncertainty and its importance can be very helpful in deciding what to do next in a strategy/performance thinking process, so in this way it is helpful rather than complicating.

Of course, thinking about risk and uncertainty is only attractive if we have quick and effective methods for doing it. If our approach is to itemise all possible outcomes in minute detail and guess probabilities and impacts for each, it will take so long that little useful work will get done. Our techniques need to control the level of detail and be orderly enough to limit complexity.

Poll 3 of 15: In thinking about performance, strategy, plans, business models, resource allocation, etc, when should we start acting on our thinking?

- Try to work out a set of answers that are definitely the right ones and only then start implementing them, without modification.
- Implement what we can early, keep improving our thinking at each meeting, and turn the improvements into better action.

Most of my live audience and 90% (66 of 73) of online respondents preferred the evolutionary style, perhaps aware that in practice that's what real life forces on us. As things move on we will (or should) learn and change our ideas accordingly.

We also must accept that we can't do it all perfectly at the first attempt. Our methods for integrated management thinking - incorporating risk control - need to allow us to start simply and build from there, without being paralysed by analysis.

The first attempts may be little more than some ill-coordinated objectives or priorities and perhaps a list of areas of uncertainty that seem important. Over time and successive sessions of thinking these will develop more structure, more interconnections. They will be shown as diagrams as well as lists. Facts will be collected to resolve some of the uncertainties. Some parts of our thinking will be illuminated by quantitative models, perhaps on electronic spreadsheets.

Poll 4 of 15: In thinking about performance, strategy, plans, business models, resource allocation, etc, when should we expect benefits?

Expect benefits from our thinking only when it is finished.Expect benefits from our thinking as we go along, with each improvement increasing the benefits.

The live audience overwhelmingly thought they should expect benefits along the way, even though the thinking wasn't perfect or completed. Of online respondents, 96% (65 of 68) agreed. This is consistent with the practical need for an evolutionary approach.

Poll 5 of 15: In a large or medium sized organization is all management thinking...

- ... part of one integrated, coordinated process, typically with central control; or
- ... the combination of thinking by different groups, at different levels, with different perspectives, not necessarily coordinated at all times?

Most of my live audience and 94% (59 of 63) of online respondents thought that thinking in organisations was more spread out, and not necessarily coordinated at all times. This again probably reflects what is achievable, but one might also argue that diversity, loose coupling, and multiple perspectives make organizations more robust.

The point for risk control is that visions of one giant, coordinated system for managing risk, or management thinking more generally, are probably unrealistic. It is more realistic to imagine many management teams at different levels of different organizational sub-units, looking at different topics. Each team will be at a different level of maturity on each topic, sometimes making steady progress upwards, and sometimes sent down again by unexpected events.

Types of improvement

Now it's time to think about what kind of improvements we would value in a risk control approach.

Poll 6 of 15: Would a more accurate/true understanding of how the world works, capturing more causal links between events, be an improvement to thinking about performance, strategy, etc, all other things being equal (e.g. complexity being equal)?
- Yes.
- No.


Most of the live audience and 94% (60 of 64) of online respondents agreed that a more accurate understanding of the world was desirable, all other things being equal. This leads into several easy ways to improve management thinking.

As mentioned above, the first musings towards a risk robust strategy/plan or process are usually just ill-coordinated, badly defined thoughts put down as lists. A risk register is just a list. A list of objectives is just a list. What lists do not show is how one thing leads to another. Very often when we look at a set of objectives it can be seen that achieving some of those objectives would be progress towards achieving others, either by definition or because of causal links. Similarly, very roughly half the items on most risk registers have a causal link with another risk register item.

Once people have got the first ideas down and played with them a bit they get enough familiarity to go on to try to link them. At first it may be just some of their ideas that seem to be particularly closely linked. Eventually most or even all the ideas (or what replaces them) may come to be put into a scheme that captures how one thing leads to another.

This applies in particular to actions we might take. At first, these float around only loosely linked to our models of how things work. Eventually they get more fully integrated. For example, 'controls' are often listed against 'risks' to show coverage, but many 'controls' have knock-on effects that we might want to understand better. To do that we need to weave those actions into thinking that captures their knock-on effects as well as their immediate effect on the 'risks' in question.

Another way that accuracy improves is by replacing ideas about how the world works that are often wrong with ideas that are wrong less often. We may not like it, but many strategies and associated risk assessments are based on beliefs about the world that are little more than guesswork. In other cases we may be fairly sure of things, but our success may depend so much on being right that even a slight chance of being wrong is a serious matter.

A simple step forward is to mark our thinking in some way to show where our assessments have come from and how much experience (including historical data) supports our views. Where this shows there is little to support something that is important we should be motivated to find out more, perhaps by more work with data we already have, or by experimenting.
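As an illustration only (the grades, fields, and threshold below are my own invented examples, not from any standard), marking assumptions with their evidence strength can be as simple as a small data structure plus a filter that picks out important ideas resting on weak support:

```python
from dataclasses import dataclass

# Illustrative evidence grades, ordered from weakest to strongest support.
EVIDENCE_GRADES = ["guess", "anecdote", "expert judgement", "historical data", "experiment"]

@dataclass
class Assumption:
    statement: str
    importance: int   # 1 (minor) .. 5 (success depends on it)
    evidence: str     # one of EVIDENCE_GRADES

def needs_attention(assumptions, min_importance=4, weak=("guess", "anecdote")):
    """Important assumptions resting on weak evidence: candidates for fact finding."""
    return [a for a in assumptions if a.importance >= min_importance and a.evidence in weak]

beliefs = [
    Assumption("Discounts lift repeat purchases", 5, "guess"),
    Assumption("Delivery delays cause complaints", 3, "historical data"),
    Assumption("The new market will grow strongly", 5, "anecdote"),
]

flagged = needs_attention(beliefs)  # the first and third assumptions
```

Even this crude marking makes the fact-finding agenda obvious: work through the flagged list, starting with whatever data or experiments are cheapest.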

This technique can be applied to risk analyses when done separately, to other management thinking done separately from risk thinking, and also to integrated management thinking.

Poll 7 of 15: Would a clearer, easier to understand, easier to use and explain view of how things work, with less complexity, be an improvement to thinking about performance, strategy, etc, all other things being equal?
- Yes.
- No.


Most of the live audience and 93% (56 of 60) of online respondents agreed that a simpler, easier to use view of the world was desirable, all other things being equal.

The practical improvements this suggests include improving the clarity of text used to capture our thinking. For example, the text describing 'risks' in most risk registers today is so vague that it is difficult to tell exactly what is meant. Careful inspection and feedback, with some simple statistics, can be used to improve the text dramatically. The same applies to the wording of most objectives.

Turning lists into interlinked models shown as diagrams, with boxes connected by arrows, makes the connections between things easier to understand, which is another reason for moving in this direction.

Poll 8 of 15: Would an understanding of the world that had more detail in key areas, with a better focus on what matters, be an improvement to thinking about performance, strategy, etc, all other things being equal (e.g. complexity being equal)?
- Yes.
- No.


Again, most of the live audience answered 'Yes', as did 97% (57 of 59) of online respondents. This confirms that there are many ways management thinking can improve, which in turn means many opportunities to refresh the thinking and gain new insights that lead to new and valuable actions.

The downside of greater accuracy is often greater complexity. Developing accuracy and detail only in key areas is a good way to manage the trade-off. A good way to do that is to write down a very simple initial set of ideas, then reflect on them, thinking about which parts are most worth developing further.

A good example of this is an approach introduced by Chris Chapman and Stephen Ward that they call Constructive Simplicity (references are given at the end of this article). They are interested mainly in quantified models for decision making that have risk and uncertainty explicitly shown in the model. Their technique is to begin with a very simple yet fully quantified model - possibly little more than a couple of formulae - that includes some numbers for risk. Using this model gives information about what uncertainties matter most and it is these that will get attention in the next wave of model development. This process simply repeats as often as necessary, usually several times.

The model can be developed in a number of different directions. It could be extended back into causes, or forward into effects. It could be developed down to a more detailed level. It might use more sophisticated mathematics to represent the uncertainty distributions. It might be moved into a computer tool that can compute the risk numbers easily.
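A minimal sketch of this iterative style, with invented numbers and variable names (not from Chapman and Ward's own examples): start with a couple of formulae, express each input only as a rough range, and use a crude 'swing' analysis to decide which uncertainty deserves the next wave of development:

```python
# First-pass model: profit = sales * margin - fixed_costs, with each
# input expressed only as a rough (low, high) uncertainty range.
INPUTS = {
    "sales":       (800, 1200),    # units sold
    "margin":      (4.0, 6.5),     # profit per unit
    "fixed_costs": (2000, 2500),
}

def profit(values):
    return values["sales"] * values["margin"] - values["fixed_costs"]

def swing(name):
    """Range of profit when only this input varies and the others sit at their midpoints."""
    mid = {k: (lo + hi) / 2 for k, (lo, hi) in INPUTS.items()}
    lo, hi = INPUTS[name]
    return abs(profit({**mid, name: hi}) - profit({**mid, name: lo}))

# The input with the biggest swing is the one most worth modelling in more detail next.
most_important = max(INPUTS, key=swing)  # here, 'margin'
```

Each pass of modelling effort then goes where the last pass showed it matters most, which is the essence of the constructively simple approach.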

Poll 9 of 15: Would an understanding of the world that was better quantified be an improvement to thinking about performance, strategy, etc, all other things being equal - including the skill and work needed?
- Yes.
- No.


Again, most of my live audience and 90% (53 of 59) of online respondents thought they would value more quantification, in an ideal world.

Can it be done? We're used to fully quantified thinking about money, but quantifying other aspects of management thinking is more controversial. The mistake is to think that only a fully quantified model is useful and that, since a fully quantified model is often hard to create, let alone make accurate, no quantification is worth attempting. This is wrong.

In reality even a little progress towards quantification can be useful. When trying to choose between alternative strategies it is helpful to have a rough quantitative idea of the strength of certain relationships. For example, sales are only part of business success, but if we know that one tactic raises sales by between 5% and 15% whereas another raises sales by only 0% to 3%, then choosing is easier than it would be if we just had two tactics in mind that seemed, in principle, likely to work to some extent.
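Using only the figures from the text, even interval estimates like these can settle a choice without any probability distributions. A tiny sketch:

```python
# Effect of each tactic on sales, known only as a (worst, best) range,
# using the figures from the text: tactic A lifts sales 5-15%, tactic B 0-3%.
tactic_a = (0.05, 0.15)
tactic_b = (0.00, 0.03)

def clearly_better(x, y):
    """True if x's worst case beats y's best case, so no finer analysis is needed."""
    return x[0] > y[1]

choose_a = clearly_better(tactic_a, tactic_b)  # True: A's worst case beats B's best case
```

Only when the two ranges overlap do we need to invest in finer-grained quantification, which is exactly the level-of-detail discipline argued for above.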

Poll 10 of 15: Would an understanding of the world that had wider scope, including more factors that were relevant, be an improvement to thinking about performance, strategy, etc, all other things being equal?
- Yes.
- No.


Again, most of my live audience and 93% (55 of 59) of online respondents agreed that wider coverage would be valuable.

This is not the end of the list of ways to increase the value of management ideas but it's enough to make the point that it's a long list.

Summary of conclusions so far

Most people seem to agree that, ideally, thinking about risk would be integrated with thinking about strategy, performance, planning, etc and that it should be possible to start simply and develop the thinking in steps, with increased benefits at each level. This is something that will be done more or less independently by different management teams and different levels, even though they will naturally share their thinking from time to time. Also, there are many things that would be seen as improvements.

I call risk control that works like this 'progressive risk control' because it moves ahead progressively and because I think it is the future for risk control.

How does this compare with risk management and internal control as commonly practised in organizations today? The formal parts of those systems seem to be, typically, separate from other management thinking and stuck at one level of sophistication. Either things stay at the crude level achievable with a few hours of workshops, a flipchart, and a bit of voting, or at the other extreme, a boffin works for months on a mathematical model. In both cases people get bored. They get bored stuck at a crude level with thinking they increasingly see as flawed and that is not advancing. They also get bored waiting for a model that is supposed to answer every question but ultimately doesn't inspire much confidence.

Happily, informal risk control is part of everyday thinking, tends to improve through experience and effort, and is practised by a lot of people independently. It could be done better, and it would be helpful if the formal processes encouraged this. A progressive approach needs to work in a natural way, which leads us to the next topic.

The role of models

Now it's time to look at a key technical breakthrough that can help make progressive risk control a reality.

Poll 11 of 15: Are our analyses of risk/uncertainty driven at all by our mental models (beliefs) of the world, how it works, and what would happen if we took various actions?

- No, our beliefs about the world have no relevance to, or effect on, risk analysis.
- Yes, our beliefs about the world influence or even determine our risk analyses.

95% (55 of 58) of online respondents agreed that our beliefs about the world influence or even determine our risk analyses.

I asked a very similar question in a separate survey during 2008. The survey asked people if they thought alternative, valid analyses of risk in the same situation were possible. It also asked them about possible reasons why alternative analyses might be possible. 87% of respondents in this survey believed that alternative, valid analyses of risk were possible and the most widely supported reason for this was having 'alternative perspectives or models' (supported by 86% of the 87%). In other words, what we know and how we structure that knowledge drive how we analyse uncertainties in our knowledge.

The technical breakthrough that can help us see more clearly how risk thinking can be integrated with other management thinking is the realisation that best practice risk analysis and best practice performance/strategy management are now both based on the same thing: causal models of how the world works that incorporate possible actions we might take.

The leading edge of risk management occurs where great skill, data, and computer power are brought to bear. (Impressive though these analyses are, they are not perfect, as some bankers used to think.) Typically, this involves a mathematical model of a business (or a part of it) represented on a computer and calibrated by using data about past events. The model is a collection of variables (e.g. 'profit', 'headcount', 'transactions per day') linked together by relationships represented by mathematical formulae. These relationships usually capture causal links between things. For example, the 'sales' variable might be driven by the 'sales effort' and 'market demand' variables, among others. Some variables represent things we value in some way, such as profits or healthy patients.

Risk/uncertainty comes in because some or even all of the variables and relationships in the model are uncertain. Typically, we don't know the future actual values of the variables and we have some big uncertainties about exactly how the relationships work.

In this kind of analysis our uncertainties about variables and relationships are the risks and the extent of those risks is represented by information about how the potential values are distributed. For example, if the future value of 'sales' could vary over a huge range quite easily then the risk is higher than if you know it with high confidence to within 2%.
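A toy version of such a model, with invented distributions (the Gaussian parameters below are illustrative guesses, not calibrated from any data), shows how uncertainty about input variables becomes a spread of potential outcomes:

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is repeatable

# Toy causal model: 'sales' is driven by 'sales effort' and 'market demand'.
def sample_sales():
    effort = random.gauss(100, 10)   # uncertain future sales effort
    demand = random.gauss(1.0, 0.2)  # uncertain market demand multiplier
    return effort * demand           # causal relationship linking the variables

outcomes = sorted(sample_sales() for _ in range(20000))

# The risk in 'sales' is the spread of its potential values, e.g. a central 90% interval.
low, high = outcomes[1000], outcomes[19000]  # approx. 5th and 95th percentiles
typical = statistics.mean(outcomes)
```

The wider the interval between `low` and `high`, the greater the risk in 'sales', exactly as described in the paragraph above.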

(You may be wondering how this compares to risk management done by brainstorming ideas and putting them on a risk register. My studies of actual risk registers show that it compares very closely. It's as if people have models in their minds that drive the risk ideas. For example, a typical risk like 'Fire damage to our retail outlets' is clearly an uncertain variable, whose probability distribution is something we could form a view on, and where our interest is mostly in the more extreme levels of damage. My observation is that virtually all risk register items can be resolved into one or more variables in this way. In other words, in a brainstorming workshop we still have mental models in operation, it's just that they are not explicit and not shared.)

So, in summary, the leading practice in risk management is based on explicit models.

In the field of performance/strategy management explicit models are also emerging as central. In Kaplan and Norton's approach to using their Balanced Scorecard strategically, the model is called a 'strategy map'. Typically, strategy maps are drawn with boxes joined by arrows. The boxes represent variables (or groups of variables) while the arrows represent relationships, usually causal.

In contrast to models built to study risk, strategy maps do not usually show uncertainty explicitly. That's something that has yet to happen. However, Kaplan and Norton are keen to point out that uncertainty in a strategy map needs to be taken very seriously indeed. They write about 'strategy as hypothesis' and encourage activities to gain facts to support or refute the hypotheses.

Poll 12 of 15: If an organization is in a position to create explicit models for risk analysis and for strategy/performance thinking, relevant to a particular aspect of its business and at the same organizational level, how should it organize the work?

- Separate teams, meetings, documents, and models for strategy and for risk.
- One team, one set of meetings, one set of documents, and one set of models.

72% (42 of 58) of online respondents thought that one team would be preferable. Contrast this with the past tendency for risk work to be done separately.

One practical point that makes the integrated approach easier is that lists of risks can be generated automatically from a model, even if it isn't fully quantified, so that responses can be linked to risk items in the traditional way if desired.

The derivation of a risk analysis from a model can be done in a number of ways but at the lowest level there are risks for past, current, and future values of each variable, for the past, current, and future characteristics of all relationships, for the desirability of achievements, and for the structure of the model itself. Risk analyses in the past have tended to be quite good at the uncertain future values of variables, but often miss out the relationships, model structure, and desirability of achievements. Deriving a risk analysis from a model is both easier and more rigorous, and involves nothing more than making a list from things we can see in a diagram.
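A sketch of this mechanical derivation, with made-up variable names: given the model as boxes (variables) and arrows (causal links), a risk register falls out as a list:

```python
# A model captured as variables (boxes) and causal links (arrows); names invented.
variables = ["sales effort", "market demand", "sales", "profit"]
links = [("sales effort", "sales"), ("market demand", "sales"), ("sales", "profit")]

def derive_risk_register(variables, links):
    """Mechanically list risks: the uncertain values of each variable, the uncertain
    characteristics of each relationship, and uncertainty about the model itself."""
    risks = [f"Uncertainty about the value of '{v}'" for v in variables]
    risks += [f"Uncertainty about how '{a}' drives '{b}'" for a, b in links]
    risks.append("Uncertainty about whether the model's structure is right")
    return risks

register = derive_risk_register(variables, links)  # 8 items from a 4-box diagram
```

Because the list is generated rather than brainstormed, it automatically covers relationships and model structure, the areas that traditional risk registers tend to miss.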

What improvement should we implement next?

Poll 13 of 15: Do you think there is likely to be one best sequence of steps that leads from crude early thinking to the level of sophistication that is best in the long run?

- No, there is unlikely to be one way that is best in all situations.
- Yes, there should be one way that is best in all situations.

79% (46 of 58) of online respondents thought that there would not be one best way. You might already have a lot of separate risk stuff and separate strategy/performance stuff. In that case it makes sense to develop each in steps and gradually integrate them. In contrast, if you were starting from nothing you would start integrated and just develop a bigger set of models.

Furthermore, how you develop those models will depend on where they seem weakest at each stage, on what skills you have in the team, and so on.

Poll 14 of 15: In your organisation, would you typically be starting from nothing or from sets of existing documentation, e.g. risk registers, critical success factor lists?

- Starting from nothing.
- Starting from existing documentation.

88% (51 of 58) of online respondents thought they would be starting from existing documentation.

Poll 15 of 15: If you personally wanted to take one progressive step (perhaps to link things up more, or lift quality, or increase quantification, or make a diagram to add structure to a list) would you...?

- ... need permission, e.g. because company procedures require a fixed approach that cannot easily be changed; or
- ... just do it.

This question was not in the first version of this article, but the statistics so far show that 73% (32 of 44) of online respondents thought they could just take their first step. While a worrying 27% of respondents feel unable to make any improvements without permission, it is encouraging that most thought they could do something without having to wait for it.

You will have noticed that the polling results generally agree with the direction of this article. This is not just because people who disagree tend to drop out early on. The majorities are still overwhelming even if we just use the results of people who answered every question.


Thank you for reading this far.

I hope you've been prompted to consider seriously how you could apply these ideas in practice yourself. Clearly, most people think that integrated consideration of risk that gradually improves over time while still helping to drive actions is a good thing. That doesn't prove it is right, but it does mean that if you think it's a good idea you should get strong support if you suggest it and make sure everyone's views are noted, not just those who disagree. If you or anyone you know would like to discuss these issues then please get in contact.

If you are interested in exploring some kind of integrated or progressive management method you might find that this article helps you test receptiveness to the ideas. I can set up a private poll for just colleagues you nominate and you will learn the results as well as, very likely, getting a few people interested. Again, please get in touch with me.

Further reading

'How to embed risk management into performance management and strategy making' by Matthew Leitch takes a broader perspective in some ways.

'Competitive Engineering' by Tom Gilb has excellent material on many things, including inspections to raise quality of documents.

'The Strategy-Focused Organization' by Kaplan and Norton is one of the classic books covering the use of strategy maps.

'Managing Project Risk and Uncertainty: A Constructively Simple Approach to Decision Making' by Chris Chapman and Stephen Ward is the main text on Constructive Simplicity.


© 2008 Matthew Leitch

If you found any of these points relevant to you or your organisation please feel free to contact me to talk about them, pass links or extracts on to colleagues, or just let me know what you think. I can sometimes respond immediately, but usually respond within a few days. Contact details

Matthew Leitch - Author

About the author: Matthew Leitch is a tutor, researcher, author, and independent consultant who helps people to a better understanding and use of integrated management of risk within core management activities, such as planning and design. He is also the author of this website and has written two breakthrough books. 'Intelligent internal control and risk management' is a powerful and original approach including 60 controls that most organizations should use more. 'A pocket guide to risk mathematics: Key concepts every auditor should know' is the first to provide a strong conceptual understanding of mathematics to auditors who are not mathematicians, without the need to wade through mathematical symbols. Matthew is a Chartered Accountant with a degree in psychology whose past career includes software development, marketing, auditing, accounting, and consulting. He spent 7 years as a controls specialist with PricewaterhouseCoopers, where he pioneered new methods for designing internal control systems for large scale business and financial processes, through projects for internationally known clients. Today he is well known as an expert in uncertainty and how to deal with it, and an increasingly sought-after tutor (i.e. one-to-one teacher).
