
Computer Security - Prevention: Lessons from the Operation of a Nuclear Power Plant

L. Hoebeke
International Institute for Organisational and Social Development, Leuven, Belgium

This article demonstrates the means of implementing an ongoing security prevention system, starting from the design phase of information systems and continuing through all other project phases during the whole life cycle of the system.

Keywords: Prevention design, Redundancy, Simulation, Variance-resistant systems

Luc Hoebeke, civil engineer, is a partner and senior consultant in IOD, the International Institute for Organisational and Social Development. He is a member of the International Consultants Foundation, the Steering Group of Management Sciences of the Royal Flemish Engineering Association, and the Systems Groups Netherlands. As a consultant he is involved in organisational design and the introduction of technological innovations in organisations, project and strategic management, and their introduction as a permanent organisational activity. He works for both profit and non-profit organisations all over Europe. Before joining IOD, Mr. Hoebeke had managerial experience heading and designing the E.D.P. department of a multinational in the automobile industry. Previously he worked for a European mainframe computer manufacturer as project leader for the introduction of information processing equipment in client organisations. Luc Hoebeke graduated as a Civil Engineer at the University of Leuven, and he started his career with a teaching assignment in Columbia.

North-Holland

Computers & Security 5 (1986) 122-127

1. Introduction

Current technological development seems to be directed by a strong positive feedback cycle or, in non-engineering terms, by a vicious circle. Technological innovations are creating problems which can seemingly only be dealt with by more technological wizardry, which in its turn creates new problems, and so on. An example is the area of computer security.

The paradox of this vicious circle is illustrated by information technology more than by anything else. Its major aim is to provide, wherever needed, as soon as possible and as accurately as possible, those data necessary for running our overcomplex systems. The nearer this aim is approached, the more we have to create systems to protect the data flow from frauds, crooks, unauthorized persons and other threats. Thus while we create systems to make data more available, at the same time we must reduce this availability by security measures such as encryption, privacy laws, patent and copyright laws.

One of the rationales behind this absurd behaviour stems from the widespread bureaucratic frame of mind well described by Acar and Aupperle [1], who stated its fundamental assumptions:

- the world is machine-like, deterministic, analysable and predictable; tasks are decomposable into routinizable subtasks;
- people are machine-like: they can be predicted and monitored, and are motivated by extrinsic rewards;
- large systems are more difficult to destroy or constrain than small ones.

This frame of mind leads to the following contradiction every security officer is confronted with: the only way to protect a system is to foresee the unexpected and take measures to prevent it.

0167-4048/86/$3.50 © 1986, Elsevier Science Publishers B.V. (North-Holland)



Indeed, once the unexpected has been foreseen it ceases to be unexpected, whereas what really is unexpected cannot be foreseen.

This article is based upon another framework: the characteristics of viable systems as developed by S. Beer [2,3], which in our daily experience are rather variance-resistant and cope with unexpected environmental disturbances in a mostly adequate way.

But as technology never is an open system, and so not a viable one, the scope of security has to be broadened to that essential interface which makes a technological system viable: the connection between technology and the human being.

Nothing less than the TMI (Three Mile Island) accident revealed the importance of this interface for the nuclear industry. The author, together with H. Michiels and L. Janssens [4], has had the experience of the partial debureaucratization of the operations of a nuclear power plant. The lessons drawn from that experience which are relevant for computer and data security are highlighted in the next sections.

2. An Alternative to the Bureaucratic Way of Dealing with Security

2.1. Ashby’s Law of Requisite Variety

ONLY VARIETY CAN COPE WITH VARIETY

Variety is a way of measuring the complexity of a system: it is the number of possible states in which a system can be. Variety increases more than exponentially when the number of connections or interfaces between the components of the system increases. Taking into account the way information and communication technology is developing, it is evident that the complexity information security officers are confronted with becomes immeasurable, a complexity they have to deal with by generating an equal amount of complexity.
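The article gives no figures here; a minimal sketch, assuming nothing beyond the counting argument itself, makes the growth rates concrete. The component counts and the two-states-per-component choice below are arbitrary illustrations, not data from the text.

```python
# Illustrative only: how fast "variety" grows as components and the
# interfaces between them are added.  All numbers are arbitrary.

def joint_states(n_components: int, states_each: int) -> int:
    """Number of joint states of n components considered in isolation."""
    return states_each ** n_components

def interface_patterns(n_components: int) -> int:
    """Possible patterns of pairwise interfaces, each present or absent."""
    pairs = n_components * (n_components - 1) // 2
    return 2 ** pairs

for n in (4, 8, 16, 32):
    print(n, joint_states(n, 2), interface_patterns(n))
# The middle column grows exponentially with the number of components;
# the last column has n*(n-1)/2 in the exponent, i.e. it grows more than
# exponentially once the interconnections are taken into account.
```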

R. Espejo [5] illustrates how Ashby's law is being treated in a bureaucratic way and leads to a basic security dilemma. When something unexpected happens (environmental disturbance, Fig. 1), then, if the complete socio-technical system is viable, somewhere in the system someone will take an action that is not based upon preexisting rules (because of the unexpectedness); this is called an autonomous transaction.

Fig. 1. (Diagram: environmental disturbances ... ask for ... leads to ... creates ... procedures.)

Autonomous transactions are essential for the viability of the system; they generate the requisite variety to cope with the environmental variety. But an autonomous transaction is as unpredictable as its origin. For security management, when it becomes aware of it (luckily, in many cases it never does), this means a "gap" in the security system. The bureaucratic reflex is to fill that gap by new security measures, devices, instructions or procedures. But, as everyone who has had experience with the industrial action called "working to the rules" knows, the more these measures are taken, the more the rigidity of the operation of the system increases. The theoretical reason for this is that a system of technological fixes, instructions and procedures is essentially a closed one (remember it aims at closing gaps) and thus not able to behave as an open system, which is viable and thus flexible.

Herbst [6], starting from his experience in building sociotechnical systems which essentially have a degree of autonomy and freedom, coined the term "critical specification design" to point to an alternative to the bureaucratic framework in the design of systems. He, as well as S. Beer, takes an example from viable (biological) systems.

"What we find here is a new approach to the problem of design which is no longer concerned with complete detailed specification but with minimal critical specification. The main reason for this approach is a concern with systems that can learn and that can adjust themselves to environmental changes. Adjustment, learning and creative and intelligent behaviour require minimally:
- internal variability to create alternative response patterns
- the testing of alternative response patterns and evaluation of the outcome
- selection of the most appropriate response." [6]

"If we want to implement viable autonomous systems, the design will not consist of a specification of the final system (although the characteristics of this system, which are aimed at, will have to be defined and accepted); rather, what has to be specified are the conditions that make possible for a system of this type to develop. The system aimed at can rarely be implemented in one step but will need to go through successive stages of growth. The technical design should in this case be such that a viable socio-technical system exists at each stage." [6]

In the next section, some of the consequences of this approach, drawn from the author's experience of applying it to the operation of a nuclear power plant and relevant for computer and information systems security, will be highlighted.

One of the first failure-free real-time systems consisted in the duplication of essential hardware parts, an example of redundancy to cope with disturbances. However, some commonly accepted design and organisation principles of information systems seem not to have incorporated the real meaning of this measure.

One of the best sold arguments for the use of databases is that the data have to be entered only once into the system. The energy necessary to reach that design aim, which is put into validation checks and counterchecks, and the time spent to debug when, through the complexity of the internal handling of these data, some "unexpected" event happens, should be analysed for their real cost-benefit. Indeed, the cost of the input duplication is mostly much less than the cost of the other gadgets mentioned. But permitting independent entries of the same data into a database system will require some changes in their basic design philosophy. Data models of organisations require in most cases a complete specification design and do not permit redundancy.
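The article stays at the level of argument; a minimal sketch of what duplicated, independently keyed input with a simple reconciliation step might look like is given below. The record layout, field names and tolerance rule are invented for illustration and are not taken from the text.

```python
# Hypothetical sketch: two operators key the same figure independently and a
# trivial reconciliation step flags any disagreement, instead of relying on a
# single entry path guarded by elaborate validation logic.

from dataclasses import dataclass

@dataclass
class Entry:
    document_id: str
    amount: float
    entered_by: str

def reconcile(first: Entry, second: Entry, tolerance: float = 0.0) -> float:
    """Return the agreed amount, or raise so a human resolves the variance."""
    if first.document_id != second.document_id:
        raise ValueError("entries refer to different documents")
    if abs(first.amount - second.amount) > tolerance:
        raise ValueError(
            f"disagreement on {first.document_id}: "
            f"{first.amount} ({first.entered_by}) vs "
            f"{second.amount} ({second.entered_by})"
        )
    return first.amount

# The cost of the second keying is low; the pay-off is that a wrong value
# surfaces at entry time instead of deep inside the system.
agreed = reconcile(Entry("INV-042", 119.50, "operator A"),
                   Entry("INV-042", 119.50, "operator B"))
print(agreed)
```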

On the organisation side we often find the edp auditor's requirement that the various tasks should be strictly split. An operator should not program, a programmer should not operate the system. A designer should specify his design so completely that a programmer can program it without communicating with the user, and the user should specify his requirements so that they can be translated into a completely specified design.



The lessons taught in the nuclear power plant show that:
1. When there is real communication between security officers, engineers, operators and management in groupwork, a variety-generating capability is created which provides immediate security enhancements, e.g. unfeasible procedures are eliminated, understanding of priorities is shared.
2. The main obstacle to starting real communication is the way organisations personalize errors by looking for "the one" responsible and refuse to tackle the necessary interdependencies between competences and functions.
3. The implicit theory that individuals are more trustworthy or easier to control than groups suggests a picture of groups as dangerous mobs, to the neglect of their variety-generating capabilities.
4. In order to enable groups to communicate, a certain amount of overlap in knowledge and experience is necessary. This implies redundancy. Groups are to be seen as consisting of nodes in a hologram, where each node contains a perspective of the whole system, and therefore has a great amount of redundancy, rather than as pieces of a jig-saw puzzle where each piece has its own fraction of information and where a "hole" appears if one piece is missing.
5. Organisational slack (time and energy, thus priority) is necessary to start building variety-generating networks. In the nuclear power plant, for instance, there is one supplementary shift, which is in training.

Experience shows that the more involved the various interest groups are (users, system designers, operators), the more variance resistance the information systems get. This lesson unluckily is mostly learned during implementation time and much less throughout the design phases. The generally accepted bureaucratic assumptions blur this experience; most design methodologies require the completion of one project phase before starting the next one, especially in the later phases where complete specification is required.

To summarize, computer security is not an activity deployed by security specialists once a system is implemented, but a joint, enduring activity by all interest groups that are involved in the design, operation and maintenance of the system. This requires redundancy in time by reiterating project activities and overlapping knowledge.

3.2. Some Redundancy Organizing Principles

Variance-resistant systems, being socio-technical systems, must have the capability to develop and to learn continuously. That means that development activities and operational activities cannot be organisationally split.

The learning cycle starts with shop-floor knowledge (end-users and operators for information technology, plant operators and their supervisors in the nuclear power plant context) where the problems are experienced, and also finishes at the shop floor where security measures have to be tested and implemented.

The effectiveness of involving end-users and operators in the design and implementation of security and control measures has been proved empirically and is not based upon a fad for participation. Worthy of mention are two studies from which the following statements can be generalized [7,8]:
[a] There is a strong correlation between the successful introduction of innovations and the energy devoted to the preparatory stages of the design by end-users, i.e. those whose work environment is directly influenced by the innovation.
[b] The major reason why "human errors" occur in the operation of complex technical systems is to be found in the lack of knowledge about the built-in controls of these systems.

Management has the task of creating conditions for adequate autonomous behaviour. Here it is necessary to point out what is meant by management. Indeed, there is a tendency of specialists to think that the right organisational place for their function is directly underneath the CEO. In the symposium held in Amsterdam in 1984 about computer security, a complaint commonly heard was that "higher" management ought to be made aware of security problems and should be the first target for security consultants and officers. The rationale is that if budgets have to be assigned to security measures, authorization has to be obtained from higher management because of the sums involved.



Practice shows that the best advocates for those budgets are not the specialists but those managers who have a direct responsibility for the operations and for the consequences of their malfunctioning. The mixing up of responsibility with accountability leads to the fact that security needs, if felt and experienced by the operations management, are converted into political influence battles around higher management. A common approach, which in many cases backfires, is the so-called "catastrophic" approach using fear and anxiety. Security is not to be "sold"; it is a basic human need for those who have to undergo directly the consequences of its lack.

If autonomous transactions are essential for coping with unexpected events, the consequences of these transactions should not be permitted to reverberate through the whole system. In practice this means that, to attain the variance-generating capability of the environments of the subsystems, the necessary uncoupling between subsystems should be built in.

In many cases a false idea about economy of scale [9] and of the intrinsic worth of integrated systems pushes information technology towards full integration by strongly linking subsystems. To obtain this integration, too much uniformity and standardization are essential, thus reducing the requisite control variety of these systems.

From the study of living systems in their environments [10] we know that the less diversified and the more strongly linked subsystems are, the less robust the whole system is and the less resistant to minor environmental changes.

In many cases the major risks involved in computer systems are built into the system itself. Uncoupling, and thus building redundancy into the systems, may be a much less expensive and more effective way to prevent catastrophic results from security breaks than continuously adding more safety devices a posteriori.
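As one possible, hypothetical reading of this uncoupling principle (the subsystem names and the wrapper below are invented, not the author's design), each subsystem is reached only through a narrow interface that contains its failures locally:

```python
# Hypothetical illustration of uncoupling: every subsystem is called through
# a small wrapper that absorbs its failures, so a break in one subsystem does
# not reverberate through the others.  All names are invented.

from typing import Any, Callable

def isolated(subsystem: str, call: Callable[[], Any], fallback: Any) -> Any:
    """Run one subsystem's operation; on failure, degrade locally and report."""
    try:
        return call()
    except Exception as exc:  # broad on purpose: this is only a sketch
        print(f"[{subsystem}] failed: {exc}; continuing with fallback")
        return fallback

def billing() -> str:          # pretend subsystem that has been "broken"
    raise RuntimeError("tampered input detected")

def reporting() -> str:        # pretend subsystem that is still healthy
    return "weekly report"

results = {
    "billing": isolated("billing", billing, fallback=None),
    "reporting": isolated("reporting", reporting, fallback=None),
}
print(results)   # the billing break is absorbed; reporting still delivers
```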

4. Security Maintenance as an Ongoing Learning Activity

4.1. Simulation of Variances

In order to enhance the capability of internal variability for creating alternative response patterns, information technology itself provides us with a practical tool: simulation. Some edp departments permit their members and even their end-users to try to "break" the system so that much learning material can be assembled without the need of learning it the hard way. This is a first example of what simulation can be. The focus should not essentially be on the way the system has been broken (the more complex the system, the more breaking possibilities there are) but on the consequences of the break for the various subsystems. This is normally less feasible with the "real" system. This type of simulation can easily be used in prototyping environments. As a result some other spin-offs may be gained:
[a] Sufficiently interconnected systems have a tendency to show an 80/20 behaviour, i.e. 20% of the causes of breakdowns generate 80% of the breakdowns. When, after analysis, the consequences of these 20% can be determined, security measures and devices can be developed for those. A great part of the consequences from the other 80% of causes will already have been taken care of in this way (a rough computational sketch of such an analysis follows after this list).
[b] Relying upon experience, relevant manual back-up procedures can be designed with a double purpose:
- to enable end-users and operators to learn about the internal control systems by pointing at control sequences through manual simulation; this enhances the reliability of the socio-technical system;
- to train end-users and operators in breakdown handling during the development phase; permitting end-users and operators to cope with the childhood illnesses of the technical system will result in a more adequate response behaviour (adequate autonomous transactions).
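A minimal sketch of the 80/20 analysis mentioned under [a], with an invented incident log and invented cause labels (nothing here comes from the plant or from the article's data):

```python
# Hypothetical Pareto analysis: count breakdowns per cause and show how a
# small number of causes accounts for most of the incidents.

from collections import Counter

incident_log = [
    "password shared", "password shared", "backup skipped", "password shared",
    "unpatched terminal", "password shared", "backup skipped",
    "wrong tape mounted", "password shared", "backup skipped",
    "password shared", "unpatched terminal",
]

counts = Counter(incident_log).most_common()
total = sum(n for _, n in counts)

cumulative = 0
for cause, n in counts:
    cumulative += n
    print(f"{cause:20s} {n:3d}   cumulative {cumulative / total:5.1%}")
# The cumulative column shows which few causes to analyse first; measures
# aimed at them already cover most of the observed breakdowns.
```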

4.2. System Failures as the Raw Material for Operators and End-Users

If the concept of minimal critical specification design is followed and learning capability for the sociotechnical system is built in, the system's breakdowns become the most relevant raw material for making its reliability grow faster. Organisational settings have to be foreseen in order to permit the use of this material not only by end-users and operators, but also by those who are at the base of the design criteria: management and system designers.



The complaint that too much work is being devoted to the maintenance of existing systems instead of to the development of new systems, a complaint often heard in edp departments, gives an indication of the way the law of requisite variety is operating in practice. The communication link between the "complainer" and the designer is filled with hurdles such as the weighing of the earnestness of the complaint, the internal priorities of the edp department, etc.

The success achieved by problem-solving groups, consisting at the same time of those who "know" the system and those who have "decision power" to change the system, has not only been noted in the nuclear power plant [4], but is also at the base of the successful implementation of quality circles [11]. The use of communication techniques tried out by those groups [12,13] is an adequate means to enhance effectiveness in the use of failure data. In this way, the tendency to find for every security problem a mostly expensive technical solution, because the problem is shifted towards technicians only, can be avoided. Full use of the complementarity between the viewpoints of different competences and interest groups leads to cheap, understandable and accepted security measures.

Where imperfection is a motivator for greater achievement, relevant management must create conditions (information and reward systems) for replacing the bureaucratic reflex of "scapegoating" generated by the refusal to accept imperfection and unexpectedness as normal human behaviour.

5. Summary and Conclusion

The main emphasis in this article is on the implementation of an ongoing security prevention system, starting with the design phase of information systems and continuing through all other project phases throughout the life cycle of the system.

In fact, this is the method which is used in practice, but because of the organisational and cultural a priori assumptions indicated in this article, the process is mostly very inefficient. Some suggestions were given on how to start to eradicate the roots of this inefficiency. Much experience is already available indicating methodical ways of making the minimal critical specification design principles work [14].

In the future, the incompatibility between the requirements of information technology (closed, complete specification) and the principles discussed here will become more apparent. The work done on heuristics [15] and artificial intelligence [16], for example, gives some indications of how to bridge the growing gap. Without the necessary action research in real-life socio-technical systems, much of the usefulness of that work for the purpose of designing more variance-resistant systems will get lost in purely technological fireworks.

References

[1] W. Acar and K.E. Aupperle: Bureaucracy as Organizational Pathology. Systems Research Vol. 1, 3 (1984), Pergamon Press.
[2] S. Beer: The Brain of the Firm. 2nd Ed. Wiley, 1981.
[3] S. Beer: The Heart of Enterprise. Wiley, 1979.
[4] L. Janssens, L. Hoebeke and H. Michiels: The Application of Cybernetic Principles for the Training of Operator Crews of Nuclear Power Plants. Cybernetics and Systems Research 2, R. Trappl ed., Elsevier, 1984.
[5] R. Espejo: Management and Information: the Complementarity Control-Autonomy. Cybernetics and Systems Vol. 14, 1, 1983, Hemisphere Publishing Corporation.
[6] P.G. Herbst: Socio-technical Design: Strategies in Multidisciplinary Research. Tavistock, 1974.
[7] S. Alter and M. Ginzberg: Managing Uncertainty in MIS Implementation. Sloan Management Review, Vol. 20, 1, 1978, MIT.
[8] a) J.M. van Eekhout and W.B. Rouse: Human Errors in Detection, Diagnosis, and Compensation for Failures in the Engine Control Room of a Supertanker. IEEE Trans. Syst. Man Cybern., Vol. SMC-11, 12, 1982.
    b) W.B. Johnson and W.B. Rouse: Analysis and Classification of Human Errors in Trouble Shooting Live Aircraft Power Plants. IEEE Trans. Syst. Man Cybern., Vol. SMC-12, 3, 1982.
[9] Goldhar and Jelinek: Plan for Economies of Scope. Harvard Business Review 1983, 6.
[10] T.F.H. Allen and T.B. Starr: Hierarchy: Perspectives for Ecological Complexity. University of Chicago Press, 1982.
[11] E.E. Lawler III and S.A. Mohrman: Quality Circles after the Fad. Harvard Business Review 1985, 1.
[12] A.V. Feigenbaum: Total Quality Control. McGraw-Hill, 1983.
[13] R. Fukuda: Managerial Engineering. Productivity, Inc., 1983.
[14] L. Hoebeke: Linking Action Research and Operations Research: the Use of O.R. Models and Techniques in Organisational Interventions. Paper presented at the 10th IFORS Conference, Washington, 1984.
[15] a) D.B. Lenat: The Nature of Heuristics. Artificial Intelligence Vol. 19, 1982, North-Holland.
    b) D.B. Lenat: Theory Formation by Heuristic Search. Artificial Intelligence Vol. 21, 1983, North-Holland.
[16] A. Feigenbaum: The Fifth Generation. Pan Books, 1984.