Technion - Israel Institute of Technology
Computer Science Department

Unsupervised Learning of Finite Automata: A Practical Approach

D. Carmel and S. Markovitch

CIS Report #9504
May 1995

For the elimination of any doubt, it is hereby stressed that the staff member and/or the Technion and/or the Technion Research and Development Foundation Ltd. will not be liable for any property damage and/or corporeal damage and/or expense and/or loss of any kind or sort that will be caused or may be caused to you or to anyone acting on your behalf, in consequence of this statement of opinion or this report, or in any connection to it.

Unsupervised Learning of Finite Automata: A Practical Approach

David Carmel and Shaul Markovitch
Computer Science Department
Technion, Haifa 32000, Israel
Email: [email protected], [email protected]

Abstract

Unsupervised learning of finite automata has been proven to be NP-hard. However, there are many real situations in which such learning is necessary. The work presented here studies the possibility of using heuristic methods for unsupervised learning of finite automata. An algorithm for learning a finite-state machine from a sample of its behavior is presented. The algorithm is based on Angluin's L* algorithm. When L* receives a counterexample it extends its current model to agree with the example, and then consults a teacher to resolve uncertainties in the extended model. Our algorithm, instead of using a teacher, resolves the uncertainties by consulting its own model that was generated during the previous learning step. A set of experiments that shows the potential merit of the algorithm is presented.

1 Introduction

The problem of inferring a finite automaton (DFA) from its input/output behavior is well known in the machine learning literature, and continues to generate new interest. Finding the smallest finite automaton consistent with a given sample of a machine's behavior has been shown to be NP-hard [Gol78, Ang78]. It has also been shown that the minimal consistent automaton cannot even be approximated by any polynomial-time algorithm [Pit89]. Furthermore, Kearns and Valiant have shown that, even without restrictions on the learner's model representation, modeling automata from input/output behavior is as hard as factoring Blum integers, inverting RSA, or deciding quadratic residues [KV89].

Thus, passive modeling of a given automaton from an arbitrary sample seems to be infeasible. Angluin [Ang87] describes an algorithm that efficiently infers an automaton model using a `minimal adequate teacher', an oracle that answers membership and equivalence queries. This algorithm, named L*, is an efficient inference procedure that constructs a minimal DFA consistent with the learned machine. Its running time is polynomial in the number of states of the machine and in the length of the longest counterexample supplied by the teacher.

Another alternative was studied by Rivest and Schapire [RS89]. Their procedure simulates iterated interactions of a robot with an unknown environment, and is based on L*. Instead of an adequate teacher that answers queries, the learner is permitted to experiment with the environment (the machine). When the learner is requested by L* to simulate a membership query for a specific input, it operates the machine and observes the result. If the model's prediction differs from the actual behavior of the machine, the learner treats it as a counterexample. To experiment with the machine, the learner must be able to reset it. In the case of an unresettable
machine, Rivest and Schapire's algorithm uses `homing sequences' instead: special sequences of actions whose observed outputs determine the final state of the machine. Given an input parameter ε > 0, their procedure approximately identifies a minimal consistent DFA with probability at least 1 − ε, in time polynomial in the machine's size, the length of the longest counterexample, and the logarithm of 1/ε.

In this work we investigate the ability to model an unknown machine from its input/output behavior, without the presence of a teacher and with a very limited ability to experiment with the machine. Our framework simulates a situation in which an agent is required to interact with a given machine, and the machine's behavior has to be modeled in order to discover an efficient interaction strategy. In contrast to Rivest and Schapire's procedure, experiments with the machine might be too expensive or even destructive for the learner. The agent has a sample of the machine's past behavior, and it can also choose its actions during the interaction process so as to learn the machine's behavior. Action selection can be affected by the requirement to act optimally at the current stage of the interaction, but also by the intention to explore the machine.

Our inference procedure is incremental. We assume that the machine can be modeled as a DFA. At any stage of the interaction, the learner exploits the current model to predict the machine's behavior and chooses its own action according to the model's prediction. When the model's prediction is wrong, the agent updates the model to become consistent with the new counterexample. The learning procedure performs a hill-climbing search in the hypothesis space for the minimal model that is consistent with the machine's past behavior as well as with the new counterexample. Following the `Occam's razor' principle, we search for the minimal DFA consistent with the machine's behavior, using heuristic methods that look for the smallest consistent model.

The remainder of the paper is organized as follows. Section 2 defines a framework for the research and describes the related theoretical results that our work builds on. In Section 3 we describe and analyze US-L*, an unsupervised version of L* that interactively infers a model of an unknown machine consistent with its behavior: instead of consulting a teacher, US-L* answers membership queries using its own heuristics and treats wrong predictions as counterexamples. We also outline preliminary results of various experiments with the algorithm. The success of US-L* in constructing a reasonable model seems to depend especially on the sample type: for prefix-closed samples the results are quite promising, while for arbitrary samples the model size can grow to be as large as the sample size. Section 4 concludes and discusses future work.

2 Background

2.1 Definitions

A deterministic finite automaton (DFA), sometimes called a finite-state machine (FSM), is a 6-tuple M = (Q, Σ_in, δ, q0, Σ_out, F), where:

• Q is a finite non-empty set of states.
• Σ_in is the input alphabet; in our setting it is the action set of the agent.
• δ : Q × Σ_in → Q is the transition function. We extend δ to the domain Q × Σ_in* in the usual way:
  δ(q, λ) = q
  δ(q, sσ) = δ(δ(q, s), σ)
  where λ denotes the null string, s ∈ Σ_in*, and σ ∈ Σ_in.
• q0 is the initial state of the DFA.
• F : Q → Σ_out is the output function. The DFA is interpreted as a Moore machine; in our setting F returns the output of the environment when it is in state q.

We define the output of a DFA M on a string of input actions s ∈ Σ_in* as M(s) = F(δ(q0, s)). For two strings s1, s2 we denote their concatenation by s1s2.

An example of the machine's behavior is a pair (s, σ), where s ∈ Σ_in* and σ ∈ Σ_out; σ denotes the output of the machine for the input sequence s. A sample D is a finite set of examples of the machine's behavior. For an example (s, σ) ∈ D, we write σ as D(s). We say that a model M is consistent with a sample D iff for every example (s, σ) ∈ D, M(s) = D(s).

We say that the learner has a perfect model of the machine if it can perfectly predict the machine's output for any sequence of input actions. Our goal is to construct a perfect model of an unknown machine given a sample of its behavior.

2.2 Identifying a DFA from a Given Sample

Gold [Gol72, Gol78] studied the problem of identifying a DFA from a given sample by representing a DFA by an observation table (S, E, T). S is a prefix-closed set of strings; each element s of S represents the state δ(q0, s). The transition function is represented by S·Σ_in = {sσ | s ∈ S, σ ∈ Σ_in}. It is possible for two elements of S to represent the same state. E is a suffix-closed set of strings, called tests, that distinguish between the states of the machine. The answer of a string s to a test e is defined as F(se). Two strings s1, s2 ∈ S represent the same state iff their answers are equal for all tests in E. T is a two-dimensional table, with one row for each element of S ∪ S·Σ_in and one column for each element of E. An entry of the table, T(s, e), records the output of the machine for the string se, namely F(se).

A table is closed iff for every s ∈ S·Σ_in there is a row s' ∈ S such that row(s) = row(s'). A table is consistent iff for any two equal rows s1, s2 in S and for every σ ∈ Σ_in, row(s1σ) = row(s2σ). We say that a DFA M is consistent with an observation table (S, E, T) iff for every entry (s, e) in T, M(se) = T(s, e).

A DFA M(S, E, T) consistent with a closed and consistent table can be constructed as follows [Ang87]:

• Q = {row(s) : s ∈ S}
• q0 = row(λ)
• δ(row(s), σ) = row(sσ)
• F(row(s)) = T(s, λ)

Theorem 1 (Angluin) If (S, E, T) is a closed and consistent observation table, then the DFA M(S, E, T) is consistent with the table T, and any other DFA that is consistent with T but inequivalent to M(S, E, T) must have more states.

We say that a table (S, E, T) covers a sample D if for every (d, σ) ∈ D there is an entry (s, e) in T such that d = se and T(s, e) = D(d). We say that a table entry (s, e) is supported by a sample D if there is an example d ∈ D such that d = (se, σ) and T(s, e) = σ.
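
To make the construction above concrete, here is a small Python sketch, ours and not code from the report; the names ObservationTable and build_dfa, and the dictionary representation of T, are our own choices for illustration.

    # A sketch (ours, not the report's) of an observation table (S, E, T)
    # over Sigma_in, and of the construction M(S, E, T) from a closed and
    # consistent table.  Strings are plain Python strings; the empty string
    # "" plays the role of the null string lambda.

    class ObservationTable:
        def __init__(self, alphabet):
            self.alphabet = list(alphabet)   # Sigma_in, the agent's action set
            self.S = [""]                    # prefix-closed access strings
            self.E = [""]                    # suffix-closed tests
            self.T = {}                      # (s, e) -> output, i.e. F(se)

        def row(self, s):
            # The row of s: its answers to all tests; equal rows = same state.
            return tuple(self.T[(s, e)] for e in self.E)

        def s_sigma(self):
            # The extensions S * Sigma_in, which represent the transitions.
            return [s + a for s in self.S for a in self.alphabet]

        def is_closed(self):
            s_rows = {self.row(s) for s in self.S}
            return all(self.row(t) in s_rows for t in self.s_sigma())

        def is_consistent(self):
            for s1 in self.S:
                for s2 in self.S:
                    if s1 != s2 and self.row(s1) == self.row(s2):
                        for a in self.alphabet:
                            if self.row(s1 + a) != self.row(s2 + a):
                                return False
            return True

    def build_dfa(table):
        # M(S, E, T): states are the distinct rows, q0 = row(lambda),
        # delta(row(s), a) = row(sa), F(row(s)) = T(s, lambda).
        return {"q0": table.row(""),
                "delta": {(table.row(s), a): table.row(s + a)
                          for s in table.S for a in table.alphabet},
                "F": {table.row(s): table.T[(s, "")] for s in table.S}}

When is_closed and is_consistent both hold, Theorem 1 says that the automaton returned by build_dfa is the smallest DFA (up to equivalence) that is consistent with the table.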

Given a closed and consistent table (S, E, T) that covers D, we can construct M(S, E, T), which is consistent with D. Thus, the problem of finding a DFA consistent with D is reduced to the problem of finding a closed and consistent observation table that covers D.

Given a sample D, assume that we try to construct a table (S, E, T) that covers D. It is easy to find S and E such that for every d ∈ D there are s ∈ S and e ∈ E with d = se. Identifying the table requires the learner to fill all table entries. Entries that are supported by D must be filled with T(s, e) = D(se). The main question left is how to fill the entries that are not supported by D.

Definition 1 An entry (s, e) of the table (S, E, T) is called a permanent entry if T(s, e) = D(se). (s, e) is called a hole entry if T(s, e) has to be guessed by the learner. Two table entries (s1, e1) and (s2, e2) are called tied if s1e1 = s2e2.

Definition 2 An assignment is a vector of output values that assigns an output value to each hole of a table. An assignment is called a legal assignment iff tied holes get the same value.

Finding a legal assignment for a given table is easy; for example, the assignment that inserts the same output value into all the holes must be legal. We call it the trivial assignment. The problem becomes much harder if we look for a legal assignment that requires only a minimal extension of the table to make it closed and consistent. We call such an assignment an optimal assignment.

Theorem 2 (Gold) The problem of finding an optimal assignment for an observation table that covers a given sample is NP-hard.

2.3 Supervised Learning of a DFA

Following Gold's result, an exhaustive search for an optimal assignment for a given table is too hard. Angluin [Ang87] shows how to infer the structure of any machine in the presence of a `minimal adequate teacher'. The teacher directs the learner to fill hole values optimally by answering membership queries. The learner is allowed to ask the teacher membership and equivalence queries. In a membership query, the algorithm asks for the machine's output on a given string of input actions s. In an equivalence query, the algorithm conjectures that the model and the machine are equivalent; the teacher replies `yes' for a correct conjecture, or provides a counterexample m, a string on which the model and the machine disagree.

The L* algorithm maintains an observation table (S, E, T). Initially S = E = {λ}, and all table entries are filled by membership queries. In its main loop, L* tests the current table for closeness and consistency; if either fails, L* extends the table until it becomes closed and consistent, filling new table entries by membership queries. When the table becomes closed and consistent, L* constructs M(S, E, T) and asks an equivalence query. If the teacher provides a counterexample, the table is extended to include the new example. The algorithm continues until the teacher replies `yes' to the conjectured model.

Theorem 3 (Angluin) L* eventually terminates and outputs a minimal DFA equivalent to the teacher's DFA. Moreover, if n is the number of states of the teacher's DFA and m is an upper bound on the length of the longest counterexample provided by the teacher, then the total running time of L* is bounded by a polynomial in n and m.
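
As a point of reference for the unsupervised variant of Section 3, the following self-contained Python sketch renders the L* loop just described. It is our own illustration, not the report's code: the teacher is modeled by two assumed callables, member(s) (the machine's output on s) and equivalent(model) (None for `yes', otherwise a counterexample string).

    # A sketch (ours) of the L* loop with a minimal adequate teacher.
    # member(s) answers a membership query; equivalent(model) answers an
    # equivalence query with None or a counterexample string.

    def l_star(member, equivalent, alphabet):
        S, E, T = [""], [""], {}

        def row(s):
            return tuple(T[(s, e)] for e in E)

        def fill():
            # Fill every entry of S and S*Sigma_in by membership queries.
            for s in S + [s + a for s in S for a in alphabet]:
                for e in E:
                    if (s, e) not in T:
                        T[(s, e)] = member(s + e)

        while True:
            fill()
            # Consistency: equal S-rows must stay equal after every action.
            bad = [(a, e) for s1 in S for s2 in S if row(s1) == row(s2)
                   for a in alphabet for e in E
                   if T[(s1 + a, e)] != T[(s2 + a, e)]]
            if bad:
                a, e = bad[0]
                E.append(a + e)               # new distinguishing test
                continue
            # Closeness: every row of S*Sigma_in must equal a row of S.
            s_rows = {row(s) for s in S}
            open_rows = [s + a for s in S for a in alphabet
                         if row(s + a) not in s_rows]
            if open_rows:
                S.append(open_rows[0])        # promote the unmatched row
                continue
            # Closed and consistent: conjecture M(S, E, T).
            model = {"q0": row(""),
                     "delta": {(row(s), a): row(s + a)
                               for s in S for a in alphabet},
                     "F": {row(s): T[(s, "")] for s in S}}
            cex = equivalent(model)
            if cex is None:
                return model
            for i in range(1, len(cex) + 1):  # add cex and all its prefixes to S
                if cex[:i] not in S:
                    S.append(cex[:i])

The counterexample is handled, as in Angluin's original formulation, by adding it together with all of its prefixes to S and refilling the new entries by membership queries.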

Another version of L* is also used by the learning procedure described by Rivest and Schapire [RS89]. The learner holds a model of the machine's behavior, based on a closed and consistent observation table, and compares the model's predictions to the actual behavior of the machine. When the model's prediction is wrong, the learner treats it as a counterexample and extends the table to cover the new example exactly as L* does. Instead of consulting a teacher, their algorithm fills new holes optimally by conducting experiments with the machine: to fill an entry (s, e), the action sequence se is fed to the machine and the machine's output is inserted into the hole.

3 Unsupervised Learning of a DFA: A Heuristic Approach

An adequate teacher, or an unlimited ability to experiment with the machine, does not exist in most practical situations. What should an agent do when a model of the environment's behavior is essential for its success, but it has neither a teacher nor the ability to experiment? The theoretical results covered in the previous section are disappointing in the sense that inferring a reasonable model of the environment's behavior seems to be computationally too hard. Constructing a naive model that covers the data is possible but does not seem to bring much benefit; for example, using the trivial assignment to fill an observation table that covers the sample would probably yield a model whose size is that of the given data, so no generalization would be made.

We propose to deal with this problem by considering heuristic approaches that search for an optimal assignment for a given observation table. Following the `Occam's razor' assumption, we should look for the smallest model consistent with the given data in order to obtain a model with high predictive power. By using heuristics that try to control the growth of the model during extension, and by making some limiting assumptions on the given data, we show that a reasonable solution can be found.

This approach is analogous to the well-known classification problem of constructing a decision tree from a set of pre-classified examples. Finding the smallest decision tree consistent with given data is known to be NP-hard [QR89]. However, using heuristic methods such as those described in [Qui86], a `reasonable' consistent decision tree can be constructed in time polynomial in the size of the given data, and this approach has proven effective in many practical applications.

The following subsection describes US-L*, an unsupervised version of L* that interactively constructs a model of an unknown machine from its input/output behavior. At any stage of the learning session, when a counterexample arrives and the table is extended, holes are filled by the current model in hand, but an attempt is made to change hole values in order to control the growth of the table.

3.1 Description of the Learning Algorithm

During its encounters with the environment, the learning agent holds a model consistent with the environment's past behavior, and exploits the model to predict the environment's future behavior. The model M is based on a closed and consistent observation table (S, E, T) that covers the past examples, and M = M(S, E, T). A new example can be either a supporting example or a counterexample. In the first case, the algorithm turns the hole entries of the observation table that are supported by the new example into permanent entries. This operation does not change table values, so the current model is not changed either.
For a counterexample, two cases have to be considered. When the new example contradicts hole values of the table, the algorithm changes these values to become consistent with the example, and marks these entries as permanent entries.
When a counterexample is not covered by the table, the algorithm extends the table to cover it in the same way as L* does. Following that, the algorithm arranges the updated table to become closed and consistent again, and constructs a new model consistent with the new table.

New entries of the extended table are filled as follows. When an entry is supported by a past example, it gets the example's output value and is marked as a permanent entry. When a table entry is not supported by a past example, it gets the output value predicted by the current model and is marked as a hole entry. This approach guarantees that at any stage of the learning process the observation table covers the past examples of the environment's behavior, and that the assignment to the hole entries of the table is always legal. The claim follows from the fact that for any pair of tied entries, if both are permanent then both are supported by the same example; if both are holes and both were added to the table at the same stage, then both got their output value from the same model; and if one of them is an old entry and the other was added later, the new one gets the same value as the old one, because the current model is consistent with the table and must therefore assign the new entry the old entry's value.

Although the resulting model is consistent with the past behavior of the environment, its size can grow without limit, and its ability to predict the environment's future behavior is then questionable. The guiding principle of the algorithm is therefore to limit the growth of the model as much as possible. When the algorithm extends the table to cover a new counterexample, it attempts to change the hole assignment so that the new example is covered with the minimal necessary extension. If two rows s1 and s2 were equal before the new example arrived and are not equal after the table's extension, the algorithm tries to change the assignment to keep the rows equal and thus avoid adding new states to the current model. The rule for changing the assignment is quite simple. If two previously equal rows s1 and s2 are not equal after the extension, there is a test e such that T(s1, e) ≠ T(s2, e). If one entry is permanent and the other is a hole, the hole entry gets the output value of the permanent one. When both are hole entries, the longer one gets the output value of the shorter one. Changing the value of a hole entry causes all its tied entries to get the same value, to keep the assignment legal, but it might also cause other rows to become unequal.

The same heuristic is applied when the consistency of the table is checked. If the table is not consistent after the extension, there are two equal S-rows s1 and s2 and an action σ ∈ Σ_in such that row(s1σ) ≠ row(s2σ), i.e., there is a test e such that T(s1σ, e) ≠ T(s2σ, e). L* resolves this inconsistency by adding a new test σe to E, an extension that separates row(s1) from row(s2) and adds a new state to the current model. Before adding a new test and separating the two rows, US-L* tries to resolve the inconsistency by changing the hole assignment: when T(s1σ, e) ≠ T(s2σ, e), hole values are changed according to the same rules as above. The new test is added only after the attempt to resolve the inconsistency by changing hole values has failed. Figure 1 shows pseudo-code for the algorithm. The suggested heuristic is easy to implement, but it is not the only one possible.
In further research we intend to look for other heuristics, study their properties, and identify the situations in which they succeed in controlling the growth of the learned model.

3.2 Correctness of US-L*

Theorem 4 Let (S, E, T) be a closed and consistent observation table that covers D, and let t be a new example. Then US-L* eventually terminates and outputs a closed and consistent table that covers D ∪ {t}.

To prove the theorem we need the following two lemmas:

Algorithm: US-L*

Given:
  D: a set of past examples of the machine's input/output behavior.
  (S, E, T): a closed and consistent observation table that covers D.
  M: the current model, M = M(S, E, T).

Repeat
  let t be a new example of the machine's behavior
  D ← D ∪ {t}
  if t is a supporting example
    for each table entry (s, e) that is covered by t (i.e., t = se)
      mark (s, e) as a permanent entry
  if t is a counterexample
    for each prefix p of t
      if p ∉ S ∪ S·Σ_in
        add p to S and Extend(S, E, T)
      else
        let e be the suffix of t such that t = pe
        if (p, e) ∈ T
          mark (p, e) as permanent and change its value to D(t)
          for every s ∈ S ∪ S·Σ_in such that row(s) = row(p)
            if (s, e) is a hole, set (s, e) (and its tied entries) to D(t)
        move p into S and Extend(S, E, T)
    mark all entries as unchanged          {to prevent an infinite consistency loop}
    while not Consistent(S, E, T)
      find two equal rows s1, s2 ∈ S, σ ∈ Σ_in and e ∈ E such that T(s1σ, e) ≠ T(s2σ, e)
      if both (s1σ, e) and (s2σ, e) are permanent, or both have been changed before
        {we must distinguish between rows s1 and s2}
        add σe to E and Extend(S, E, T)
      else  {one entry is a hole that has not been changed before (assume it is (s2σ, e)),
             or both are such holes (assume s1 ≤ s2)}
        T(s2σ, e) (and its tied entries) ← T(s1σ, e)
        mark (s2σ, e) (and its tied entries) as changed
    while not Closed(S, E, T)
      find s ∈ S·Σ_in such that for every s' ∈ S, row(s') ≠ row(s)
      move s into S and Extend(S, E, T)
    M ← M(S, E, T)
Until forever

Figure 1: US-L*: unsupervised learning algorithm of a DFA.
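
The heart of Figure 1 is the rule that tries to re-unify two diverging rows by changing hole values before a new test (and hence a new state) is introduced. The following Python fragment is our illustrative reading of that rule, not code from the report; table.T, table.permanent and table.changed are bookkeeping fields we assume for the example.

    # A sketch (ours) of the hole-value propagation rule of US-L*: when two
    # rows r1, r2 that used to be equal disagree on a test e, try to re-unify
    # them by copying a value into a hole instead of adding a new state.

    def tied_entries(table, s, e):
        # All table entries (s', e') with s'e' = se; they must share one value.
        return [(s2, e2) for (s2, e2) in table.T if s2 + e2 == s + e]

    def try_unify(table, r1, r2, e):
        # Return True if the disagreement T(r1, e) != T(r2, e) was resolved by
        # changing a hole that has not been changed before in this repair pass.
        a, b = (r1, e), (r2, e)
        free = [x for x in (a, b)
                if x not in table.permanent and x not in table.changed]
        if not free:
            return False                      # both fixed: a new test is needed
        if len(free) == 1:
            target = free[0]
            source = b if target == a else a
        else:
            # both are free holes: the longer row inherits the shorter row's value
            target, source = (a, b) if len(r1) >= len(r2) else (b, a)
        value = table.T[source]
        for entry in tied_entries(table, *target):   # keep the assignment legal
            table.T[entry] = value
            table.changed.add(entry)
        return True

In the consistency loop of Figure 1, such a call would be attempted first; only when it fails (returns False) is the distinguishing test σe added to E, exactly as L* would do.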

Lemma 1 The closeness loop at the end of the algorithm terminates after a finite number of iterations, does not change the consistency of the table, and outputs a closed table.

Proof: At each iteration, S is extended with one row that is distinct from all other rows in S, and E is not extended during the loop. The number of different rows in S is finite and bounded by |Σ_out|^|E|, so the number of iterations must also be finite. The loop terminates only when the table is closed, so it outputs a closed table. The consistency of the table is not changed during the closeness loop because any row added to S is distinct from all other rows in S, so no inconsistency can be introduced. □

Lemma 2 The consistency loop is finite and outputs a consistent table.

Proof: Define an inconsistent point as a tuple (s1, s2, σ, e), where s1, s2 ∈ S, row(s1) = row(s2), σ ∈ Σ_in, e ∈ E, and T(s1σ, e) ≠ T(s2σ, e). At each iteration at least one inconsistent point is eliminated, but others might be added to the table. We show that this process must be finite.

First, an extension of E might add inconsistent points to the table. We show that the number of extensions of E is finite. At each iteration, E is extended with at most one element. Assume that σe is added to E to eliminate an inconsistent point (s1, s2, σ, e); then T(s1, σe) ≠ T(s2, σe), while for every other test e' ∈ E, T(s1, e') = T(s2, e'). Therefore the added column is distinct from all other columns of T. S is not extended during the loop, so the number of different columns is bounded by |Σ_out|^|S| and the number of extensions of E must be finite.

Second, an inconsistent point may be eliminated by changing hole values. This might cause consistent points to become inconsistent, because changing a table entry causes all its tied entries to be changed as well. To prevent an infinite loop, any entry that is changed is marked and cannot be changed again. At each iteration at least one inconsistent point is eliminated, and any other point can become inconsistent only once, so the loop must terminate after a finite number of iterations and output a consistent table. □

Now we can prove the correctness of the algorithm.

Proof (of Theorem 4): If t is a supporting example the proof is trivial. If t is a counterexample, the algorithm extends (S, E, T) to cover t. During the table extension, every permanent entry is filled with the value of its supporting example, and every hole entry is filled by the current model. This filling strategy ensures that the extended table covers D ∪ {t} and that the assignment to the holes is legal. The two lemmas above show that the consistency and closeness loops at the end of the algorithm terminate and output a closed and consistent table, which completes the proof. □

3.3 Example Runs of the Algorithm

Assume that Σ_in = {a, b}, Σ_out = {0, 1}, and the learned machine has only one state q0. Figure 2 shows the learned machine, and Figure 3 shows a learning session of the algorithm. The initial model has one state and predicts the opposite of the machine's output. Following the first example, (λ, 0), the algorithm changes the table entry (λ, λ) to zero and marks it as a permanent entry. The other entries of the table are also changed by the algorithm, to preserve equal rows and thus prevent extension of the model. The resulting model, M1, is equivalent to the learned machine; any following example must be a supporting example and does not change the model. This example points out the importance of the heuristics used by US-L*: had the other entries of the table not been changed, the resulting model would have had two states, and every further example would have added at least one more state to the model.

Figure 2: DFA0.

Figure 3: A learning session of DFA0; holes are marked by squares. The accepted model is equivalent to the machine after the first example.

Figure 4 shows another DFA, and Figure 5 describes an example of a learning session for that machine.

Figure 4: DFA1.

The first example, (λ, 1), is a supporting example and only changes the entry (λ, λ) into a permanent entry. The second example, (a, 0), is a counterexample; table entries are changed and the new model has two states. The third example, (ab, 0), is also a counterexample. The algorithm adds ab to S and aba and abb to S·Σ_in, and an inconsistent point, (a, ab, b, λ), is created. In contrast to the original L*, changing the entry T(abb, λ) to equal T(ab, λ) makes the table consistent without further extension. As a result, the third model is equivalent to the learned machine.

The examples, and the order in which they arrive, largely determine the size of the learned model. The correctness proof also shows that the observation table can grow to be exponential in the size of the sample. Figure 6 shows the model constructed for DFA1 when an arbitrary stream of examples is provided to the algorithm: after four examples, (bbab,0), (abbab,1), (bababa,0), (aaabbb,0), the model has eight states and is still not equivalent to the learned machine.
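
The sessions above follow the interaction protocol of Section 3.1: each new observation is first checked against the current model's prediction, and only a wrong prediction is treated as a counterexample that triggers the table update of Figure 1. A minimal sketch of that outer loop (ours; update_model stands in for the US-L* update, and the model format follows the build_dfa sketch of Section 2.2) is:

    # A sketch (ours) of the outer interaction loop: each observation (s, sigma)
    # is compared with the current model; only wrong predictions are treated
    # as counterexamples.

    def predict(model, s):
        # Run the model on an action string s: M(s) = F(delta(q0, s)).
        q = model["q0"]
        for action in s:
            q = model["delta"][(q, action)]
        return model["F"][q]

    def interact(examples, model, update_model):
        # update_model(model, example, supporting) stands for the table update
        # of Figure 1; it returns the (possibly unchanged) new model.
        for s, sigma in examples:
            supporting = (predict(model, s) == sigma)
            model = update_model(model, (s, sigma), supporting)
        return model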

Figure 5: A learning session of DFA1. After three examples, the learned model is equivalent to the machine.

Figure 6: The model accepted for DFA1 from an arbitrary sample, (bbab,0), (abbab,1), (bababa,0), (aaabbb,0).

We hypothesize that the algorithm is best suited for prefix-closed samples. We conducted an experiment in which random machines of various sizes were created and modeled by US-L*, using random prefix-closed samples of their behavior of various sizes. A random machine with a given number of states was constructed by choosing a random transition function and a random output function. One hundred experiments were conducted for each pair of sample size and machine size. It is important to note that the randomly built machines are not necessarily minimal. Figure 7 shows the average size of the learned models as a function of the sample size and of the machine size. It is quite clear that the average size of the learned models is similar to the size of the machines and is not affected by the sample size. These results suggest that US-L* has a strong ability to capture the common pattern of the sample.
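
The experimental setup can be reproduced in outline as follows. The Python sketch below is ours; the distributional details (uniform random choices, random-walk sample generation) are assumptions, since the report does not specify them.

    # A sketch (ours) of the experimental setup: a random Moore machine with a
    # given number of states, and a prefix-closed sample of its behavior.

    import random

    def random_machine(n_states, alphabet=("a", "b"), outputs=(0, 1)):
        # Random transition and output functions; state 0 is the initial state.
        delta = {(q, a): random.randrange(n_states)
                 for q in range(n_states) for a in alphabet}
        F = [random.choice(outputs) for _ in range(n_states)]
        return delta, F

    def output(machine, s):
        delta, F = machine
        q = 0
        for a in s:
            q = delta[(q, a)]
        return F[q]

    def prefix_closed_sample(machine, n_strings, max_len, alphabet=("a", "b")):
        # Random walks from the initial state; every prefix of every walk is
        # included, so the sample is prefix-closed.
        sample = {}
        for _ in range(n_strings):
            s = ""
            sample[s] = output(machine, s)
            for _ in range(random.randrange(1, max_len + 1)):
                s += random.choice(alphabet)
                sample[s] = output(machine, s)
        return sample                          # maps each string s to D(s)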

Figure 7: Average size of the learned models for random prefix-closed samples of various sizes, as a function of the size of the learned machine (one curve per machine size, from 4 to 8; model size is plotted against sample size).

4 Discussion

Unsupervised learning of finite automata has been proven to be NP-hard. However, there are many real situations in which we must perform such learning. The work presented here studies the possibility of using heuristic methods for unsupervised learning of finite automata.

When a learning algorithm receives an example that is not consistent with its current model, it changes the model to agree with the new example. In the case of learning a DFA, this step leads to the creation of "holes" in the table representing the automaton. Filling these holes in an optimal way is NP-hard. The L* algorithm overcomes the complexity problem by using a teacher to fill the holes. Rivest and Schapire's algorithm overcomes it by experimenting with the machine. The method presented in this paper is targeted at situations in which a teacher is not available and experimentation is not possible.

At each learning step, our method fills the holes by consulting the model that was generated
in the previous step. The filled holes have tentative status, and the algorithm readily changes their values if the changes may lead to a more compact model.

We conducted a set of experiments in which random automata were generated and the algorithm tried to learn them from prefix-closed samples of their behavior. The algorithm managed to learn very compact models that agree with the samples, and the size of the sample had very little effect on the size of the model. Obviously our algorithm does not solve the complexity issue, and it is always possible to build an adversarial sample. However, the experimental results suggest that there may be classes of problems for which the algorithm behaves well.

The work presented here is only a first step. We are currently working on the following issues:

• Analyze the parameters that affect the algorithm's complexity.
• Find conditions that identify classes for which the algorithm behaves well.
• Identify alternative heuristics for filling holes.
• Test alternative search methods in the space of assignments.

References

[Ang78] D. Angluin. On the complexity of minimum inference of regular sets. Information and Control 39, 337-350, 1978.

[Ang87] D. Angluin. Learning regular sets from queries and counterexamples. Information and Computation 75, 87-106, 1987.

[Gol72] E. M. Gold. System identification via state characterization. Automatica 8, 621-636, 1972.

[Gol78] E. M. Gold. Complexity of automaton identification from given data. Information and Control 37, 302-320, 1978.

[KV89] M. Kearns and L. G. Valiant. Cryptographic limitations on learning boolean formulae and finite automata. In Proceedings of the 21st ACM Symposium on Theory of Computing, pages 433-444, 1989.

[Pit89] L. Pitt. Inductive inference, DFAs and computational complexity. In K. P. Jantke, editor, Analogical and Inductive Inference, Lecture Notes in AI 397, pages 18-44. Springer-Verlag, 1989.

[QR89] J. R. Quinlan and R. L. Rivest. Inferring decision trees using the minimum description length principle. Information and Computation 80, 227-248, 1989.

[Qui86] J. R. Quinlan. Induction of decision trees. Machine Learning 1, 81-106, 1986.

[RS89] R. L. Rivest and R. E. Schapire. Inference of finite automata using homing sequences. In Proceedings of the 21st ACM Symposium on Theory of Computing, pages 411-420, 1989.
