

The Natural Method of designing internal control systems

by Matthew Leitch, 18 January 2003


Why design?
- data quality meltdown
- persistent waste
- regulations
- stress
What to expect
- shortage of information and time
- resistance born of desperation
What not to do
How to do it right
- getting into the project plan
- how to do high level design
- proposing work packages
- near "Go Live!"
Tips on some key control mechanisms
- process monitoring
- ergonomics
- comparing totals
- validation and edit checks
- segregation of duties
Finally

Please note

If you like the approach described in this article (and why wouldn't you?), the easiest way to master it properly is to engage me, the author, for some individual technical tutoring or teletutoring sessions.

It's not a difficult technique, but I've noticed that most people still benefit from some help in getting it just right.

Why design?

This paper is about designing internal control systems (precautions we take to guard against error, fraud, or other perils) for business processes, such as billing, purchasing, and treasury management.

But why recognise internal controls design as a discipline in its own right? Why not assume it is done by everyone in the normal course of their jobs?

When a business creates these processes for the first time, or makes significant changes, internal controls will be established. This usually happens because the people involved know from their education or past experience that bad things happen and that precautions are needed. For example, the IT people will worry about hackers, viruses, and disasters such as fire and flood in the data centre. Accountants will be looking for reconciliations and approvals. Managers will want reports. And so on.

This organic, decentralised process works pretty well but it does have flaws, and these lead to some serious risks and inefficiencies.

Data quality meltdown

Almost all organisations have a huge investment in data - data about customers, suppliers, products, employees, and so on - gathered, checked, and stored in databases and files. Consequently, one of the most costly problems with new systems or processes is the data quality meltdown. Here is how it happens.

At the start of the implementation project, task lists are drawn up and lots of good ideas about quality assurance and internal control are usually captured and put into the plan. However, as time goes on and things take a bit longer than expected, the pressure builds. As people become stressed, their focus narrows. Meetings are held at which people ask "What do we really, really need to do?" Little by little, the quality assurance and internal control tasks get de-scoped and eliminated. Go-live weekend arrives (three months late, but it still seems like a triumph) and the champagne is opened. For a while everything seems to be going well, though people are struggling with the unfamiliar way of working.

Then the first evidence of problems starts to emerge. Someone runs a suspense report for the first time and discovers that thousands of errored transactions have already built up. More checks are done and more problems emerge. Time to panic. More temps are hired and a crisis team is formed. Overtime is already heavy, possibly with shift working.

But by this time the vicious cycle has taken hold. People often make mistakes when they try to correct mistakes. Reference data is already contaminated with errors and is generating more and more incorrect transactions. The extra work of correcting mistakes leaves people tired and stressed, so they make more errors, especially when trying to correct errors late at night, for the third time.

Recovering from this sort of meltdown can cost more than the original implementation. It is better to apply expertise to internal controls from the outset to minimise the risk of meltdown.
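
To make that concrete, here is a minimal sketch, in Python, of the kind of routine monitoring check that could be agreed and scheduled before go-live, so that a suspense backlog is noticed on day one rather than weeks later. It is an illustration only, not a method taken from this article: the table name, column names, and tolerance threshold are all hypothetical.

# A minimal sketch (illustrative only): a scheduled data-quality check that
# counts transactions sitting in suspense each day, so a backlog is noticed
# immediately rather than weeks after go-live. Table name, column names, and
# the threshold are hypothetical.
import sqlite3
from datetime import date

ALERT_THRESHOLD = 50  # tolerance agreed with the process owner (hypothetical)


def count_suspense_items(conn: sqlite3.Connection) -> int:
    """Return how many transactions are currently held in suspense."""
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM transactions WHERE status = 'SUSPENSE'"
    ).fetchone()
    return count


def daily_check(conn: sqlite3.Connection) -> None:
    """Run the check and flag a backlog that exceeds the agreed tolerance."""
    count = count_suspense_items(conn)
    print(f"{date.today()}: {count} transactions in suspense")
    if count > ALERT_THRESHOLD:
        # In practice this would raise a ticket or e-mail the process owner.
        print("ALERT: suspense backlog exceeds tolerance - investigate today")


if __name__ == "__main__":
    # Demonstration with an in-memory database standing in for the real system.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE transactions (id INTEGER, status TEXT)")
    conn.executemany(
        "INSERT INTO transactions VALUES (?, ?)",
        [(i, "SUSPENSE" if i % 3 == 0 else "POSTED") for i in range(200)],
    )
    daily_check(conn)

The value of such a check lies less in the code than in the fact that someone has decided, before go-live, what will be monitored, how often, and who acts on the result.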

Persistent waste

Fortunately, data quality meltdowns are not common. However, wasted time and damaged customer goodwill are almost universal effects of not designing internal controls.


. . . . .
There's more
The whole text of this article is freely available, without registration, by clicking the link below. Please remember that this website exists because people (perhaps including you) express their thanks for its help in practical ways, such as thinking about how to use its ideas, considering my services or the book, taking part in research, or suggesting topics. Thanks for reading, and I hope you enjoy the full article.

Full article
© 2003 Matthew Leitch