Great fleas have little fleas
Upon their backs to bite 'em,
And little fleas have lesser fleas,
And so, ad infinitum.
I have no idea who wrote the above, but as a summary of why systems that incorporate human elements nearly always operate in failure mode, it's unsurpassed. Even in the absence of elements with coercive powers, most systems are in failure mode most of the time...because of their "fleas."
Ponder it a moment while I put up a pot of coffee.
First, let's agree on some fundamentals:
- A system, for the purposes of this discussion, is an organization with a declared purpose toward which it and its members are supposedly working.
- Smith is a member of system X if he has:
  - Accepted, de jure, the authority of the system over his efforts; and
  - Accepted any form of compensation for those efforts.
- System X is in failure mode whenever:
  - It is not advancing measurably toward its declared purpose; and
  - It is expending any of its resources: human, material, financial, or otherwise.
(We must also agree that: some failure modes are worse than others; some are more perceptible than others; and some are more open to disagreement than others. In any discussion of human enterprise and human relations, there will always be gray areas within which there's room for differences of opinion.)
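The definitions above are effectively a pair of predicates, and can be sketched as such. This is purely illustrative; all of the names below (`Member`, `is_member`, `in_failure_mode`) are invented for the sketch, not taken from the essay:

```python
# Illustrative sketch of the essay's two definitions as predicates.
# All names here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Member:
    name: str
    accepted_authority: bool   # de jure acceptance of the system's authority over his efforts
    compensated: bool          # has accepted some form of compensation for those efforts

def is_member(person: Member) -> bool:
    """Smith is a member of system X only if both conditions hold."""
    return person.accepted_authority and person.compensated

def in_failure_mode(measurable_progress: bool, expending_resources: bool) -> bool:
    """System X is in failure mode whenever it expends resources
    without measurable progress toward its declared purpose."""
    return expending_resources and not measurable_progress
```

Note that by this definition an idle system (expending nothing) is not in failure mode, however little it accomplishes; the failure lies in the conjunction of expenditure and non-progress.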
A great deal has been thought, said, and written about the habitual failure of significant systems. Many commentators have made accurate observations about parts of the problem. However, one ultimately piercing observation subsumes them all:

A system will advance reliably toward its declared purpose only insofar as its members are wholly and sincerely committed to that purpose.

The key words in the above are wholly and sincerely.
If you're familiar with James Buchanan and Gordon Tullock's work in Public Choice Economics, perhaps you can see where this is headed.
As soon as a system acquires more than one member, it acquires more than one set of priorities, regardless of its declared purpose. This, like the abuse of power, should come as no surprise. Individuals possess individual motivations and goals, about which they're very seldom completely candid. Indeed, few of us are fully aware of our motivations and goals; this is in the nature of the human psyche. Thus, varying degrees of actual, as opposed to professed, commitment to the organization's purposes are inevitable.
It is also inevitable that as soon as a system begins to acquire resources, some of its members will view those resources as potentially applicable to purposes other than those of the system. Sometimes, this divergence is fan-danced by rationalization, e.g.: "I can keep working at home if I just take a few things from the supply cabinet / parts locker / petty cash box." Once in a great while, the member involved is perfectly sincere...at least at the outset.
We could go into rhapsodies about the fallibility of Man, about the frailties of human character, and so forth, but ultimately all such considerations, being premised on a set of abstract values which don't command universal agreement (or fidelity), are too personal to be used analytically. What does work analytically is to view systems as ecologies.
In the previous essay, I laid some foundations for an ecological approach to the study of systems, one of the most important elements of which was this:
If Smith, an individual member of the Acme organization, is not wholly and sincerely committed to Acme's purpose, then he will be animated by other purposes at least part of the time. That being given, Smith will be tempted to view the environment Acme provides him as a field of resources for his exploitation for those other purposes. To the extent that Smith succeeds in presenting, to Acme authorities above him, the appearance of good and valuable service, he will be permitted to continue in this practice.
Note also this: the "good and valuable service" need not be entirely -- on occasion, not even partly -- to Acme's purpose. If Acme is large enough to have a hierarchical authority structure, those above Smith will have some non-Acme priorities of their own. If Smith's supervisor Jones can exploit Smith as a resource for Jones's own non-Acme purposes, he will naturally be more tolerant of Smith's own peculations than otherwise.
Great fleas have little fleas, and little fleas have lesser fleas...
Highly applicable here is Robert Anton Wilson's "SNAFU Principle." The partial concealment of Smith's activities from levels of authority above him protects his divergences from full commitment to the organization's purpose. Even when "top management" is aware that it's not being fully informed about what's going on at levels below it, it seldom possesses the capacity to learn more than it's being told.
With this, it becomes clearer why corruption (as colloquially understood) varies combinatorially with the size of an organization. What's massively unclear is whether there's anything to be done about it.
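The combinatorial claim can be made concrete with back-of-the-envelope arithmetic. If every pair of members is a potential Smith-and-Jones arrangement of mutually tolerated divergence, the number of such pairs grows quadratically with headcount. A minimal sketch (the function name is invented for illustration):

```python
# Back-of-the-envelope count of potential two-party arrangements
# of mutual tolerance within an organization of n members.
from math import comb

def divergence_pairs(n_members: int) -> int:
    """Number of distinct member pairs: n choose 2."""
    return comb(n_members, 2)

for n in (10, 100, 1000):
    print(n, divergence_pairs(n))
# 10 -> 45, 100 -> 4950, 1000 -> 499500
```

A tenfold increase in membership yields roughly a hundredfold increase in opportunities for the Smith-and-Jones dynamic, before even counting longer chains of concealment up the hierarchy.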