
Data & Knowledge Engineering 5 (1990) 79-91, North-Holland

Knowledge engineering versus software engineering*

Fernando ALONSO, Jose Luis MATÉ and Juan PAZOS, Faculty of Computer Science, Polytechnic University of Madrid, Spain

Abstract. This paper carries out a retrospective analysis of the evolution of Software Engineering, with reference to its techniques as well as its methodology. It then follows the development of Knowledge Engineering and considers the two approaches which were successively put forward: the paradigm of power as opposed to the paradigm of knowledge, and their integration. Finally, the existing similarities and differences between KE and SE are established. These similarities and differences are analysed in the following four fields: the design and development of software products and technologies; the tools and techniques for the design and building of software; the architecture of the software system; and the architecture of the hardware system.

Keywords: Knowledge engineering, Software engineering, Design techniques of software, Software architecture, Hardware architecture, Knowledge and power paradigms.

1. Introduction

When the organizers planned the 3rd Symposium on Knowledge and its Engineering they put forward tools as the "leitmotiv". The opening statement contained the following: Carlyle [5] said in his Sartor Resartus: "... men are animals who use tools... They are nothing without them, but they are everything with them...". Carlyle's opinion is only another way of expressing what F. Bacon [1] had already expressed in his Novum Organum: that neither the hand nor the mind on their own have any great power; to carry out a task, instruments and aids are required which are as necessary to intelligence as to the hand. And in the same way as physical instruments accelerate and regulate hand movement, intellectual instruments ease or discipline the course of the mind. This same idea, specifically applied to the experimental sciences, was set out by the Nobel Prize winner Simon [28] in his keynote speech at the First Symposium on Knowledge and its Engineering. There Prof. Simon said: "The development of any experimental science needs to have available the adequate tools which allow the researcher's mind to be powerful". Of course, he was referring specifically to tools such as ART, Knowledge Craft or KEE.

Nevertheless, the Persian fairy tale "The Three Princes of Serendip" can be applied to the 3rd Symposium. In this story the main characters are fortunate to have the gift of often making unexpected discoveries, i.e. discoveries that are not looked for but come about accidentally, as a result of other circumstantial happenings. So when the main characters of this story were looking for something they always found something better. Horace Walpole [19], in order to name this form of undeserved luck, or similarly of results reached which are opposite to those looked for, invented the word "serendipity", which appears for the first time in a letter he sent to Mann in 1754 and which obviously came from the title of the story.

* Introduction to the special section devoted to papers selected by the editors from the 3rd Symposium on Knowledge and its Engineering, held 17-21 October in Madrid, Spain.

0169-023X/90/$03.50 © 1990 - Elsevier Science Publishers B.V. (North-Holland)


Serendip or Serendib was the name given by the Arabs to what is known today as Sri Lanka and was until a few years ago called Ceylon. In effect, in spite of the intentions of the organizers, the third Symposium dealt, almost exclusively, with the relationships that exist or can exist between software engineering and knowledge engineering, which had curiously been the "leitmotiv" of the previous Symposium. Fortunately, this unforeseen change of theme was attended by the previously mentioned serendipity effect and, therefore, its results were much better than expected.

The following is an analysis of the main points which were debated in this symposium and which are at present the object of very interesting research. Initially a retrospective analysis of the evolution of software engineering is carried out, with reference to its techniques as well as to its methodology, and later the development of knowledge engineering is dealt with. Concerning this subject, its evolution is analyzed by taking the birth of cybernetics in the 1940s as a starting point, then going on to the Dartmouth Summer Research Project on Artificial Intelligence, held in 1956, and finishing up with the approaches which were successively expounded: the paradigm of power and the paradigm of knowledge.

The main points are the existing parallels and deviations between the techniques of Software Engineering (SE) and Knowledge Engineering (KE). These similarities and differences are analysed in four areas:
- the design and development of software products and technologies
- tools and design techniques and software construction
- architecture of the software system, and
- architecture of the hardware system.

2. Software engineering

Although there are discrepancies in the definition of software engineering [4, 8, 12, 29], at present it implies the following:

1. The activity or function of a software engineer is directed towards the management, design, development, implementation and maintenance of software, or focused on the controlled manipulation of the software life-cycle.

2. The application of science, especially mathematics and heuristics, with which control mechanisms and electronic calculations are converted into useful elements for the creation of structures, machines and systems by way of computer programmes, procedures and their complementary documentation.

During the past decades we have grown to recognize circumstances that are collectively called "software crisis". Software costs escalated dramatically, becoming the largest dollar item in many computer-based systems. As software systems grew larger, quality became suspect. Individuals responsible for software development projects had limited historical data to use as guides and had less control over the course of a project. A set of techniques, collectively called "software engineering techniques", has evolved as a response to the software crisis. These techniques deal with software as an engineering product that requires planning, analysis, design, implementation, testing and maintenance [20].

SE techniques evolved as a reaction to the quickly increasing costs of software systems. At the beginning of the 1960s, large companies like General Electric, IBM and Mobil Research and Development Corporation began to look for ways to run software projects as if they were engineering projects. Later, in 1968, the NATO Scientific Committee (NSC) met in Garmisch, F.R. Germany [16], to discuss international actions in the field of Computer Science. There, a study group was established, presided over by F.L. Bauer, whose mission was to evaluate the whole field. This group recommended that a working meeting should be held on SE, a title deliberately chosen to imply that the design, production and maintenance of software should be based on the technical foundations and practical disciplines that are traditional in the established branches of engineering. The following year, at the NATO Conference on SE [17], more than fifty software professionals from eleven countries met to discuss software engineering problems.


Table 1. Methodologies for SE: phases proposed by each author.

| Brandon and Gray | Boehm | Freeman | Metzger |
|---|---|---|---|
| Application identification and project selection | | Needs analysis | Definition |
| System survey | System requirements | | |
| Data gathering | Software requirements | Specification | |
| System analysis | Preliminary design | Architectural design | Design |
| System design | Detailed design | Detail design | |
| Programming | Code and debug | Implementation | Programming |
| System test | Test and pre-operations | Program testing | |
| | | System testing | Acceptance |
| Conversion and installation | | | Installation and operations |
| System maintenance | | | |
| System evaluation | | | |
| | Operations and maintenance | | Maintenance |


The issue at this conference was that building software is not very different from any other engineering task, and that the corrective actions in the building process correspond "grosso modo" to those of other disciplines. The conclusion was that SE should be modelled according to the paradigms and methods of already established engineering disciplines. In 1969, the NSC conference on SE techniques was devoted to a more detailed study of the technical problems in SE, excluding management aspects. This conference emphasized the specification, quality and flexibility of software; topics in the development of large systems; and education in SE.

In the 1970s this engineering-methodological focus took hold, governed by the establishment of "rules for the process" which develop over time. In the case of SE, these rules specify activities for each phase of the software life-cycle. In summary, Table 1 [11] shows some of the methodologies developed during those years.

Although each one appears to have a different prescription for the development and implementation of software, on closer inspection it becomes apparent that they are really very similar. It is worth pointing out that Barry Boehm's methodology has a marked economic perspective, as it implies the use of cost estimates of software systems as an aid in deciding on their development. In accordance with this economic focus, SE has three main areas:

Page 4: Knowledge engineering versus software engineering

82 F. Alonso et al. / Knowledge engineering vs. software engineering

a. Analysis of cost-effectiveness, which includes models to estimate the performance of a system in terms of its parameters, production functions to relate input with output, and decision criteria to choose between alternatives (a small illustrative cost-model sketch follows this list).

b. Standard economic measures for "multigoal decisions", such as estimates of present values, marginal analysis and figures of merit, which are combined with system analysis and goal-evaluation techniques in order to compare the relative values of the ways in which the goal of the developed system can be achieved.

c. Management of uncertainty and risk, combining an estimation of the value of information with risk analysis, to simplify choices between system development options when the probability of success is unknown.
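Boehm's economic focus is easiest to see in a concrete cost model. The paper itself gives no formulas, so the following is a minimal sketch, in Python, of the basic COCOMO model that Boehm later published in Software Engineering Economics (1981); the coefficients are his published values, while the 32-KLOC example project is an arbitrary illustration.

```python
# A minimal sketch of Boehm's basic COCOMO cost model, shown only to
# illustrate the economic focus described above. Effort is estimated as
# E = a * KLOC**b person-months, with Boehm's published coefficients
# for the three project modes.

COEFFICIENTS = {
    "organic":      (2.4, 1.05),  # small teams, well-understood problems
    "semidetached": (3.0, 1.12),  # intermediate size and experience
    "embedded":     (3.6, 1.20),  # tight hardware/operational constraints
}

def estimate_effort(kloc: float, mode: str = "organic") -> float:
    """Estimated effort in person-months for `kloc` thousand lines of code."""
    a, b = COEFFICIENTS[mode]
    return a * kloc ** b

# A hypothetical 32-KLOC project under each mode:
for mode in COEFFICIENTS:
    print(f"{mode:>12}: {estimate_effort(32, mode):6.1f} person-months")
```

Such models support exactly the "decision criteria to choose between alternatives" of item a: two candidate designs of different estimated sizes can be compared in person-months before either is built.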

The 1970s and 1980s were years of transition and of recognition of software; the decade of software is now upon us. In fact, advances in computing may become limited by our ability to produce quality software that can exploit the enormous capacity of the processors of the 1990s.

Nevertheless, the objectives of software engineering must always be present, and they are:
- define a good methodology that addresses a software life-cycle of planning, development and maintenance;
- establish a set of software components that documents each step in the life-cycle and shows traceability from step to step;
- create a set of predictable milestones that can be reviewed at regular intervals throughout the software life-cycle [20].

3. The development of knowledge engineering

The search to build machines which "resemble" people is as old as the historical tradition of humanity itself. In effect, even in the most remote past, men or their dreams have tried to create artifacts which emulate or at least simulate some of the facets of human beings, and in particular their intelligence. The fact is that all the myths and achievements concerning automatons and "homunculi" are nothing more than the expression of an inevitable desire of human beings to surpass themselves in all aspects of life: in the physical or material aspect, by the use of tools and the production of machinery; in the intellectual aspect, by creating gadgets which develop or at least potentiate their intelligence; and finally in the metaphysical aspect, by creating beings similar to the human being "ex nihilo", that is to say, starting from nothing.

This is not the right moment to elaborate and give details of all the steps which humanity has taken in this direction since it became culturally conscious of itself, as this has been dealt with in many texts [7, 23, 10, 21]. However, it is relevant to point out that the inflection point at which the human aspiration to surpass itself passed from a futile one to a useful one can be placed at the beginning of the 1940s, with the publication of three works relative to what at that time was known as cybernetics. In the first, Wiener, Rosenblueth and Bigelow [22] of MIT suggested different ways of giving "aims" and "goals" to machines, i.e. of making them teleological, having as an ultimate end the finding of a group of simple principles which would explain the activities of the human mind. In the second, Craik [6], of the University of Cambridge, proposed that machines employ models and analogies in solving problems, and expressed conclusively that instead of one's theory being as wide as reality, the fact is that one's perception of reality is as narrow as one's theory, and for this reason both coincide. Finally, McCulloch and Pitts [15] (McCulloch belonging to the College of Medicine at the University of Illinois and Pitts to MIT), basing themselves on the works of Shannon, the first Kyoto Prize winner, in which he modelled the behaviour of electrical circuits using Boolean algebra, showed how machines could use logical and abstract concepts and demonstrated how any law of input-output could be modelled on a network of formal neurons. Some years later the famous and provocative article of Turing [31], which spectacularly began by proposing to examine the question "Can machines think?", could be added to these three works.
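The formal neuron is simple enough to state concretely. What follows is a minimal sketch of a McCulloch-Pitts unit in Python; the specific weights and thresholds are illustrative choices, not taken from the 1943 paper, which only states that networks of such units can realize any Boolean input-output law.

```python
# A McCulloch-Pitts formal neuron fires (outputs 1) when the weighted sum
# of its binary inputs reaches a threshold.

def neuron(inputs, weights, threshold):
    """Binary threshold unit: 1 if the weighted input sum meets the threshold."""
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

# Elementary logic gates, each realized by a single formal neuron:
AND = lambda x, y: neuron([x, y], [1, 1], threshold=2)
OR  = lambda x, y: neuron([x, y], [1, 1], threshold=1)
NOT = lambda x:    neuron([x],    [-1],   threshold=0)

# A two-layer network computing XOR, a law of input-output that no single
# threshold unit can realize on its own:
XOR = lambda x, y: OR(AND(x, NOT(y)), AND(NOT(x), y))

assert [XOR(x, y) for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]
```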

With everything taken into account, and despite the fact that those articles marked the beginning of the course of Artificial Intelligence (AI), later named as such by McCarthy [14], the reality is that the ideas contained in those works did not leave the framework of pure technical speculation until the mid-fifties. Then, at Dartmouth College, McCarthy secured funds from the Rockefeller Foundation to organize the Dartmouth Summer Research Project on AI. At this event, held in 1956, he succeeded in bringing together the ten most outstanding investigators of this area of knowledge: Bernstein, McCarthy himself, Minsky, Moore, Newell, Rochester, Samuel, Selfridge, Solomonoff and Simon, the Nobel Prize winner in 1978.

All those attending the meeting hoped, in one way or another, for the discovery, a hope previously cherished by von Neumann [18], of some mental laws which would spring initially from the power of the computer and which would allow them to model dynamic, complex and qualitative problems. This gave rise to what was later called the paradigm of power. This focus worked well at the beginning, developing quite a number of techniques and methods of representation as well as methods of search, but soon it was seen that the systems implemented following this strategy were not very efficient in practice. The focus changed when the investigators of this subject realized that humans process with very little calculation, but with a large amount of structured knowledge. This new paradigm, which found its practical expression with the creation in the mid 1970s of the first expert systems, DENDRAL, MACSYMA, MYCIN and PROSPECTOR, was named the paradigm of knowledge.

These two paradigms, the heuristic and the epistemological, make possible the recognition of AI as a science and as a technology. As a science, AI deals with the study of intelligent behaviour, its ultimate aim being to achieve a theory of intelligence which can explain the behaviour of naturally intelligent beings. As a technology, AI builds machines which solve problems whose solution by humans would show them to be intelligent beings, making good the statement by Vico [32], "Certum quod factum" ('only be sure of what you do'). As a result, knowledge engineering emerges as an activity to build knowledge-based systems in general, and in particular expert systems, whose mission is to acquire, formalize, represent and use appropriately great amounts of very high quality knowledge specialized for a very specific task. This technological view of AI aims at giving human beings procedures, methods and techniques which will improve their intellectual capacities by automating routine tasks of thought and reasoning, without belittling the advancement of computer studies as a science as well as a technology. The contributions which the laboratories and the persons dedicated to AI have made to computer science are well known, and it would seem appropriate to mention the following: windows, icons, menus, time sharing, object-oriented programming, and the development of prototypes.

4. Parallels and differences between the techniques of SE and KE

4.1 Design and development of software products and technologies

After looking at the development of KE and SE separately, it is now the right moment to examine them together, as Druffel and Little stated in their article: "Of both SE and AI it can be said that their evolution has been a limited but productive symbiosis. Indeed, some SE concepts have been pioneered or supported by the AI research community. For example, concepts of reuse, specialized languages, and supporting environments were developed in support of AI research. In addition, AI techniques offer potential for supporting the SE process. However, in generating AI based products the approaches that work so well during the technology development and exploration phase are not appropriate for development. Products need to be more general and must support a broader audience, including those who are generally unfamiliar with AI concepts. Products must be capable also of evolving as user needs change. These are issues that AI practitioners currently need to address. The SE community, on the other hand, has been dealing with these issues for over two decades. From these efforts, a collection of effective techniques and supporting technology has evolved to deal with building software products. Experience in application of those techniques indicates that, for AI technology to be applied successfully to products, those systems will do well to follow the same principles that the SE community is learning to apply to conventional software systems".

As the techniques of AI have become more mature, the tendency has been towards making products which incorporate these techniques. These products are, as Druffel and Little point out in their article, predominantly software, although frequently they are not recognised as such. This idea is shared by many others [33, 9], some even going so far as to state something as conclusive as the following: "Building computer-based information systems involves some basic tasks: problem detection, identification and definition; solution definition (functional requirements); system analysis; logical and physical system design; procedure and programme design; procedure and programme writing; programme testing; integrated testing; conversion and installation; and operation. The organization of these tasks may change, but the tasks still must be performed".

In effect, as Druffel and Little point out, "the AI research community has relied on prototypes to demonstrate the technology initially; however, when the concepts turn to products, the products tend to feature characteristics of more conventional software systems".

In this sense, and following the considerations of the Software Engineering Institute (SEI) of Carnegie Mellon University for SE and applying them to KE, both engineering disciplines have the following issues in common:

a. Quality, including correctness, reliability, and performance.

b. Managing the software and knowledge engineering process to achieve predictable costs and schedules. The production of software is an extremely complex undertaking. Current practice often yields systems of lower quality than desired, with delivery behind schedule and over budget. The software is often part of a larger system, such as an airplane, whose cost may be orders of magnitude higher than that of the software. If the deployment of an airplane or fleet of airplanes is delayed by software, the cost of the delay is considerably higher than the cost of the software. Thus, the effective management of the software and knowledge engineering process is more than just a software and knowledge engineering issue; it is a critical issue.

c. Productivity. The demand for new software has grown at an exponential rate in the last 20 years. That rate has not been matched by the rate of growth of the number of software and knowledge engineering professionals or their productivity. Unless the capacity to produce software can grow as fast as the demand, many needed systems simply will not be built. Increasing the number of software and knowledge engineers cannot be the major factor in increased capacity, for two reasons. First, the nation's ability to educate and train new professionals cannot be substantially increased easily or quickly. Second, increasing the number of project personnel often creates more problems than it solves. Therefore, individual productivity must be improved to effect a significant increase in software and knowledge production capacity.

d. The cost of producing software and knowledge based systems. Such systems are unique among engineered artifacts because their construction generally does not consume natural resources. In addition, production of multiple copies of the original does not require a costly manufacturing process. Thus, virtually all costs are labor related. Increasing the number of software and knowledge engineers on a project necessarily increases costs, while increasing productivity can reduce costs. It is necessary, therefore, to seek solutions that are technology intensive rather than labor intensive. This will be difficult because engineers have discovered that the tasks most conducive to automation are not the creative tasks that are prominent in software and knowledge engineering, but rather the repetitive tasks of manufacturing.

All these issues are closely related. Quality contributes not only to the overall value of a system but also affects the productivity of the developing and maintaining organizations throughout the product life-cycle. Similarly, effective management of the software and knowledge development process not only enhances predictability of costs and schedules but also establishes the basis for introducing technology-based methods that will enhance productivity.

Druffel and Little draw attention, with good reason, to the need to use more disciplined approaches in the development of KE products. On this point, the models of SE, from the life-cycle waterfall model to the recent spiral model [3], can be adequate, as the authors point out: "As products begin to incorporate AI techniques such as expert systems, it is useful to bear in mind that these products are still software... The development of software products based on AI, then, can benefit from the application of SE guiding principles and goals", thus proposing a closer relationship between the two engineering disciplines.

Druffel and Little set out the contributions that SEI, to which they belong, will make to SE, in particular promoting the evolution of SE from a labour-intensive, "ad hoc" activity to a managed, technology-supported discipline. They add process management as a third major area of importance, alongside quality and productivity, in order to improve the maturity of SE as a practice. In this section on maturity the authors present a very original five-level SEI model, as well as the five actions that organizations should carry out to improve the maturity of their software process.

Another of the contributions of SEI is in technology, in the sense of pursuing technology-intensive solutions, using as strategies reusability and mathematically based design methods. All this will be much more necessary for real-time distributed applications, because of the additional complexity which these problems create.

Finally, Druffel and Little, echoing the old aphorism that "knowledge is power" [30], place education as the key to the future. This view is also endorsed within KE [13] and by Servan-Schreiber [25] when he writes, quoting a certain president of CMU (Carnegie Mellon University): "We put education at the very top of our national priorities". Certainly all those who practice either of the two engineering disciplines agree that without good preparation any technological practice is useless.

4.2 Tools, design techniques and the building of software

Habermann, Dean of the recently established School of Computer Science at CMU, puts forward as the main topic of his paper the idea that AI can learn from SE and vice versa. To demonstrate this he points out the contributions of SE to KE (system generation and maintenance, and project management) and the contributions of KE to SE (replacement of passive toolkits by interactive assistants, and the building of flexible behaviour into systems and interfaces).


The SE objectives are, as Habermann indicates, to help create high quality software and to support the production process of software. To do this, software engineers create tools, methods and techniques which guarantee a certain quality and place the production process on firm foundations, and they also attempt to measure the quality and effectiveness of their own products. Habermann points out, as relevant to the construction of knowledge-based systems (KBS), three main sub-tasks in building large software systems: programming-in-the-small, i.e. building individual programmes; programming-in-the-large, meaning the composition of systems out of modules; and programming-in-the-many, which is the same as management of a project team. The first case can be seen as a one-to-one correspondence, the second as a relationship of one with many, and the third as a relationship of many with many.

Habermann then indicates the tools and techniques necessary to tackle these three situations. For the first case he mentions: the abundance of languages and editors; the use of programming environments, concurrent with the associated tools; object-oriented programming, with its three important characteristics of data encapsulation based on object classification, class inheritance and individualized object operations (called methods); and reusability, employing parametrization techniques, programme transformation, inheritance, derivation from formal specifications, etc.
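To make those three object-oriented characteristics concrete, here is a minimal Python sketch; the Stack and CountingStack classes are illustrative inventions, not examples from Habermann's paper.

```python
# Data encapsulation, class inheritance, and individualized object
# operations ("methods"), in miniature.

class Stack:
    """Encapsulates its representation; clients use only push/pop."""
    def __init__(self):
        self._items = []          # hidden representation

    def push(self, item):         # a method: an operation owned by the object
        self._items.append(item)

    def pop(self):
        return self._items.pop()

class CountingStack(Stack):
    """Reuses Stack through inheritance, specializing one operation."""
    def __init__(self):
        super().__init__()
        self.pushes = 0

    def push(self, item):         # overriding individualizes the operation
        self.pushes += 1
        super().push(item)

s = CountingStack()
s.push("a"); s.push("b")
assert s.pop() == "b" and s.pushes == 2
```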

For the second situation, he points to the necessity of: tools for the generation of systems, similar to "make" and RCS; configuration techniques and control of system versions, where the former deal with system descriptions in terms of the specific modules of which the systems are made up, and the latter deals with the variants, modifications and upgrades which inexorably emerge in a large system project; and MIL (Module Interconnection Language) system definitions, describing a system as a collection of modules and subsystems. A MIL, in the first instance, is a language for the interconnection of modules together with a group of tools which operate on the objects described in that language, but it would be better if a MIL were an interactive programming environment, preferably concurrent, that takes most of the system configuration and version control out of the users' hands, giving way to system instantiations.
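The module-interconnection idea can be sketched in a few lines. The following is a minimal illustration, not any real MIL's syntax: each module declares what it provides and requires, a tool checks that the configuration is complete, and a build order is derived, which is the job "make"-like system generators automate. The module names are hypothetical.

```python
# A toy module-interconnection description with an interface-completeness
# check and a derived build order.

from graphlib import TopologicalSorter   # standard library, Python 3.9+

modules = {
    "parser":  {"provides": {"ast"},    "requires": {"tokens"}},
    "lexer":   {"provides": {"tokens"}, "requires": set()},
    "codegen": {"provides": {"object"}, "requires": {"ast"}},
}

# Interface check: every required facility must be provided by some module.
provided = {p for m in modules.values() for p in m["provides"]}
missing = {r for m in modules.values() for r in m["requires"]} - provided
assert not missing, f"unsatisfied interfaces: {missing}"

# System generation order: build providers before their clients.
providers = {p: name for name, m in modules.items() for p in m["provides"]}
deps = {name: {providers[r] for r in m["requires"]} for name, m in modules.items()}
print(list(TopologicalSorter(deps).static_order()))
# -> ['lexer', 'parser', 'codegen']
```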

Concerning project management, there are different opinions as to how to put this into effect. Some face the problem by using methodologies, the majority of which are specifically designed to support the life-cycle; others concentrate on mechanisms. Regarding mechanisms, the most important issues every project team faces are: concurrency, access control and clustering.

Finally, Habermann points out that it is precisely from these three mechanisms that KE can learn from SE. Concerning programming-in-the-small, the advice is to replace the text editor by a language editor, to adopt the object-oriented programming style and to provide a workbench for revising code. With respect to programming-in-the-large, it pays to have a formalism and an environment in which module interfaces and system configurations can be described; such an environment can check interfaces and can also handle version control, and a database providing atomic transactions is the proper underlying mechanism for these special-purpose environments. He also points out that what SE should learn from KE is, on the one hand, to create intelligent assistants instead of toolkits, to build user-friendly interfaces, to make more tolerant and flexible systems, and to add knowledge to the tools and their environment; and on the other hand, to base the design of systems on a model of the task to be executed which includes the users.

4.3 Architecture of the software system

Mary Shaw, renowned worldwide as the editor of the CMU Curriculum for Computer Science Studies [26], analyzes in her paper "Towards Higher-Level Abstractions for Software Systems" the new problems arising from the increase in size and complexity of present software systems. After pointing out the incapacity of the techniques in present use to solve these problems, she proposes as a solution to work at an organization level, or with software architecture. This paper is in some way a continuation of another [27] in which she set out the coming challenges of SE, pointing to the scale effect in this engineering. It was then that she pointed out that software engineers would have to deal with complex systems in which software is only one of many components in a large heterogeneous system, and where software is expected to serve as a surrogate for a human programmer, taking an active role in the development and control of software systems. These new modes of operation she described as 'program-as-component' and 'program-as-deputy'. Their relation to the other kinds of programming since the 1960s is suggested by Fig. 1.

In order to work at this level of architecture, Shaw indicates that new types of abstraction are necessary, and proposes the following as the most viable: object-oriented programming, pipes and filters, layered systems, and rule-based systems and blackboard systems. If there is any doubt about whether the parenthood of object-oriented programming lies in the AI field or the SE field, there is definitely no doubt concerning the source of the last two architectures proposed by Mary Shaw, those based on rules and on blackboards. Both are normal structures in knowledge engineering, to the point that more than 90% of the expert systems presently in routine use are based on rules, paradigmatic cases being MYCIN, DENDRAL, XCON, etc. Blackboard architecture is not used as much, due to the fact that very few knowledge engineering tools (ART among them) can support it, but the cases of HEARSAY II and III, MOLGEN, etc. demonstrate its importance (a minimal sketch of the rule-based control loop follows Fig. 1).

| Attribute | 1960±5 years: Programming-any-which-way | 1970±5 years: Programming-in-the-small | 1980±5 years: Programming-in-the-large | 1990±5 years: Program-as-component | 1990±5 years: Program-as-deputy |
|---|---|---|---|---|---|
| Characteristic problems | Small programs | Algorithms and programming | Interfaces, management, system structures | Integration of heterogeneous components | Incorporation of judgement |
| Data issues | Representing structure and symbolic information | Data structures and types | Long-lived databases, symbolic as well as numeric | Integrated databases, physical as well as symbolic | Knowledge representation |
| Control issues | Elementary understanding of control flow | Programs execute once and terminate | Program assemblies execute continually | Control over complex physical systems | Programs learn from own behavior |
| Specification issues | Mnemonics, precise use of prose | Simple input-output specifications | Systems with complex specifications | Software as component of heterogeneous system | Extensive reuse of design |
| State space | State not well understood apart from control | Small, simple state space | Large structured state space | Very large state with dynamic structure and physical form | State includes development as well as application |
| Management focus | None | Individual effort | Team efforts, system lifetime maintenance | Coordination of integration and interactions | Knowledge about application domain and development |
| Tools and methods | Assemblers, core dumps | Programming languages, compilers, linkers, loaders | Environments, integrated tools, documents | Tools for real-time control, dynamic reconfiguration | Program generators, expert systems, learning systems |

Fig. 1. Emergence of software problems.


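The rule-based architecture mentioned above reduces to a small control loop. As a minimal sketch, here is a forward-chaining production system in Python; the facts and rules form an invented toy domain, and real systems such as MYCIN or XCON add certainty factors and conflict-resolution strategies on top of this core.

```python
# A working memory of facts and a forward-chaining loop that fires
# condition-action rules until nothing new can be derived.

rules = [
    # (conditions that must all hold, fact to assert) -- illustrative domain
    ({"has_feathers", "lays_eggs"}, "is_bird"),
    ({"is_bird", "cannot_fly"},     "is_penguin"),
]

def forward_chain(facts):
    """Repeatedly fire every rule whose conditions are satisfied."""
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"has_feathers", "lays_eggs", "cannot_fly"}))
# -> derives is_bird, then is_penguin
```

A blackboard system generalizes this picture: several such knowledge sources watch a shared data structure and opportunistically contribute partial solutions to it.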

Later, in order to design architectures effectively, M. Shaw analyzes and describes the composition of systems starting from subsystems, establishing that it is possible to obtain various different implementations by combining these subsystems in different ways. To support the development of software architecture, Shaw points to two areas of work: one, mentioned by Habermann, is MIL; the other is the reference models which are developed to provide a frame for describing architectures and for showing how the components relate to one another, defining the external properties of a subsystem without pre-empting design decisions concerning its implementation. Finally, in contrast to other engineering disciplines, which codify and spread the shared understanding of design, implementation strategies and standards via handbooks that serve as repositories of information and as vehicles for the conceptual structure of the field, SE should use another form and content, based on electronic publishing, to codify and disseminate the comparative knowledge which it uses for software architectures.

4.4 Towards the integration of SE and KE: the expert system for dominoes and heterogeneous machines

The underlying thesis in all the papers presented here is the necessity for a synthesis to overcome the polarization, and sometimes confrontation, which exists between SE and KE, when in fact both are complementary and necessary.

In this respect, Borrajo, Pazos, Perez and Rios built an expert system for playing dominoes using both approaches, and this is presented in their paper "Dominoes as a domain where to use proverbs as heuristics".

In this paper, as well as pointing out the use which, for the first time, was made of proverbs as heuristic meta-knowledge, they also describe how a joint SE and KE approach was used to develop the system. In effect, in the stage where the problem is contemplated from the point of view of the user, AI and KE techniques are used, and in the stage where the problem is viewed from the perspective of the computer, SE techniques are employed.

The results could not have been more fruitful, as the system wins more than 80% of the games it plays, which makes it the best player in the world. This was confirmed when it took part in the first Computer Olympiad, held in London in 1989, where it received the Gold Medal.

Naturally, in order for this integration to become effective, adequate hardware architectures are necessary. Perhaps this necessity will be satisfied by what Barbacci [1a] calls "heterogeneous machines", or heterogeneous computer networks. These networks are concerned with allocating specialized resources to tasks of medium to large size. They need to create processes, allocate these processes to processors and specify the communication patterns between processes.

Almost by definition, KBS, especially cooperative and/or distributed ones, are implemented on computation environments made up of loosely connected networks of general purpose and special purpose processors. As he says: "These machines are of special interest to developers of real-time, embedded applications in which many concurrent, large-grained tasks or programs cooperate to process data obtained from physical sensors, make decisions based on these data, and send commands to control motors and other physical devices. During execution time, these tasks are instantiated as concurrent processes, running on possibly separate processors and communicating with each other by sending messages of different types".

Barbacci affirms, logically, that developing software for heterogeneous machines is qualitatively different from developing software for conventional processors. To back up this statement, he first characterizes, following Bell and Newell [2], a heterogeneous machine as a PMS ("Processor-Memory-Switch"), in contrast with the conventional ISP ("Instruction-Set-Processor"). In PMS, each component is specified in terms of its function, speed, capacity and other similar attributes, so that there is no equivalent to an instruction interpreter, the majority of the activities being data-driven: the processors fetch their operands, which are elements of a data queue, only when they are ready for processing, and send their results to the appropriate queues to be consumed by other processors.
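This data-driven style is easy to emulate in miniature. The following is a minimal Python sketch, not Barbacci's notation: each "processor" is a concurrent task that fetches operands from an input queue only when they are ready and sends results to an output queue for others to consume; the two-stage sensor pipeline is a hypothetical example.

```python
# Two data-driven "processors" connected by queues, in the PMS spirit:
# a scaling stage feeding a thresholding (decision) stage.

import queue, threading

DONE = object()                      # sentinel closing a stream

def processor(fn, inbox, outbox):
    """Apply fn to each operand fetched from inbox; emit results to outbox."""
    while (item := inbox.get()) is not DONE:
        outbox.put(fn(item))
    outbox.put(DONE)                 # propagate end-of-stream

sensors, scaled, decisions = queue.Queue(), queue.Queue(), queue.Queue()
threading.Thread(target=processor, args=(lambda x: x * 10, sensors, scaled)).start()
threading.Thread(target=processor, args=(lambda x: x > 25, scaled, decisions)).start()

for reading in [1, 2, 3]:            # "sensor" data entering the network
    sensors.put(reading)
sensors.put(DONE)

while (out := decisions.get()) is not DONE:
    print(out)                       # -> False, False, True
```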

Naturally, this new focus needs a new language. For this reason Barbacci presents Durra, a language to support the building of distributed applications as large-grained concurrent tasks to be executed on heterogeneous machines. A PMS program, as implemented with Durra, is made up of a "specification", or description of the tasks, and its satisfaction, or selection of the tasks. The Durra prototype implemented accordingly consists of three elements: the verifier, which selects the task and type descriptions from the library and produces a group of instructions for the scheduler; the scheduler, which uses these instructions to direct the execution of the tasks on a heterogeneous machine, with the aid of a server process running in each processor of the machine; and the task emulator, a prototyping tool which allows the user to configure and experiment with an application even before all its programmes have been developed.

Barbacci finally analyzes the interaction between the PMS program and the software development methodology represented by the spiral model of Boehm [3]. This model is a refinement of the classical waterfall model: by successive applications of the original model (requirements, design, development, verification, etc.) it progressively achieves more specific versions of the final product. This methodology has as a principal advantage the early identification of the areas of uncertainty which constitute the relevant sources of risk, and it adjusts very naturally to the PMS paradigm.

5. Conclusions

It is evident that we are immersed in the era of knowledge, and that power is intimately related to the specialized knowledge which exists in the different technological fields. This affirmation is especially valid when we confine ourselves to the field of computer science and engineering, whose development is quickly being directed along this line of thought. For this reason it would seem appropriate to take the following actions:

• Present hardware architecture has become obsolete in dealing with the immense quantity of diffuse, nonstructured and uncertain information which has to be taken in. It seems evident that the heterogeneous machine, processing in parallel functions directed by data instead of the conventional ISP, is the line along which this architecture will develop.

• Software architecture should acquire greater levels of abstraction and reusability, with implementations independent of the machine, hence making it portable. Object-oriented architecture would seem to be the motor of this development. At the same time, on incorporating more knowledge into this software in order to develop applications, the architecture should have at its disposal mechanisms for the representation and utilization of this knowledge, the system being based on rules in order to be accepted more readily. This is where the major research efforts should be directed.

Page 12: Knowledge engineering versus software engineering

90 F. Alonso et al. / Knowledge engineering vs. software engineering

• The tools for approaching the development of software ought to make it possible to capture knowledge easily from the expert, to utilize heterogeneous information, and to incorporate specialized knowledge in their design. It would seem timely to encourage the design of specialized languages with a specific purpose, instead of increasing the development of software packages for a precise application. As an example of this tendency it is relevant to point out the MIL system.

• KE does not yet rely on a methodology to direct the life-cycle of an application in its environment. This deficiency limits the development of large applications, as happened in the SE field, and until a methodological structure for development has been settled it is not possible to introduce a standardized mode of production. In this respect, the editors have developed, within the scope of KE, the IDEAL methodology (identification, development of the prototype, construction and execution of the complete system, technological transfer, and system maintenance), which is similar to the spiral model and is easily supported by heterogeneous machines. This methodology, which takes into account both the user's and the computer's approach, demonstrated its efficiency in the design and construction of an ES for playing dominoes, which achieved maximum performance in all the tests to which it was subjected.

We consider that these general lines of action are the ones which will lay down the guidelines of this engineering in the development and implementation of large software systems, moving from the present implementation of prototypes and small systems to large scale ones. All things considered, it is not Software Engineering versus Knowledge Engineering as confrontation, but Software Engineering versus Knowledge Engineering as complementary to one another, resulting in greater possibilities for both. This was the main conclusion of the symposium.

Acknowledgement

This paper was prepared and written with the collaboration of CETYICO (Centre for Technological Transfer of Knowledge Engineering, Spain).

References

[1] F. Bacon, Novum Organum (Sarpe, Spain, 1984).
[1a] M.C. Barbacci, C.B. Weinstock and J.M. Wing, Durra: Language support for large-grained parallelism, Proc. Internat. Conf. Parallel Processing and Applications, L'Aquila, Italy (Sept. 1987).
[2] C.G. Bell and A. Newell, Computer Structures: Readings and Examples (McGraw-Hill, New York, 1971).
[3] B.W. Boehm, A spiral model of software development and enhancement, Computer 21 (5) (May 1988).
[4] F.P. Brooks, Jr., The Mythical Man-Month: Essays on Software Engineering (Addison-Wesley, Reading, MA, 1982).
[5] T. Carlyle, Collected Works: Sartor Resartus (Chapman and Hall, London, 1795-1881).
[6] K.J.W. Craik, The Nature of Explanation (Cambridge University Press, Cambridge, U.K., 1943).
[7] G. Elgozy, Origines de l'Informatique (Techniques de l'Ingénieur: Informatique (H1), Paris, 1985).
[8] R. Fairley, Software Engineering Concepts (McGraw-Hill, New York, 1985).
[9] E.N. Fong and A.H. Goldfine (eds.), Data Base Directions: Information Resource Management - Making it Work, NBS Special Pub. 500-139 (National Bureau of Standards, Gaithersburg, MD, 1986).
[10] Homerus, Iliad (1986 edition).
[11] A.L. Johnson, Experts disagree on meaning of software engineering, Part I, Bridge Vol. 2 (Software Engineering Institute, CMU, Pittsburgh, PA, Aug./Sept. 1986).
[12] H. Ledgard with J. Tower, Professional Software, Vol. I: Software Engineering Concepts (Addison-Wesley, Reading, MA, 1987).
[13] J.L. Maté and J. Pazos, Ingeniería del Conocimiento (SEPA, Córdoba, Argentina, 1988).
[14] P. McCorduck, Machines Who Think (W.H. Freeman and Co., San Francisco, CA, 1979).
[15] W.S. McCulloch and W. Pitts, A logical calculus of the ideas immanent in nervous activity, Bull. Math. Biophys. 5 (1943).
[16] NATO, Proceedings (Munich, 1968).
[17] P. Naur and B. Randell (eds.), Software Engineering: Report on a conference sponsored by the NATO Science Committee (Garmisch, Germany, 1968).
[18] J. von Neumann, The Computer and the Brain (Yale University Press, New Haven, CT, 1958).
[19] The Compact Edition of the Oxford English Dictionary, Vol. II (Oxford University Press, Oxford, 1987).
[20] R.S. Pressman, Software Engineering: A Practitioner's Approach (McGraw-Hill, New York, 1982).
[21] J. Pazos, Inteligencia Artificial (Paraninfo, Madrid, 1988).
[22] A. Rosenblueth, N. Wiener and J. Bigelow, Behavior, purpose and teleology, Philosophy of Science 10 (1943).
[23] O. Seemann, Mitología Clásica Ilustrada (Vergara, Barcelona, España, 1958).
[24] SEI Programme Plans, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, 1987.
[25] J.J. Servan-Schreiber, The Knowledge Revolution (1987).
[26] M. Shaw (ed.), The Carnegie-Mellon Curriculum for Undergraduate Computer Science (Springer, New York, 1985).
[27] M. Shaw, Beyond programming-in-the-large: The next challenges for software engineering, Annual Technical Rev., SEI-CMU (1985).
[28] Magisterial Conference of the First International Symposium on Knowledge Engineering, J. Pazos (transl.), Rank Xerox on Expert Systems, Nov. 1986.
[29] I. Sommerville, Software Engineering (Addison-Wesley, Reading, MA, 1985).
[30] Sun Tzu, The Art of War (Charenton Press, Oxford, U.K., 1963).
[31] A.M. Turing, Computing machinery and intelligence, Mind LIX (236) (1950).
[32] G. Vico, The New Science (1744), T.G. Bergin and M.H. Fisch (transl.) (New York, 1962).
[33] J.R. Weitzel and L. Kerschberg, Developing knowledge based systems: Reorganizing the systems development life-cycle, Commun. ACM 32 (4) (April 1989).

