IEE Colloquium on 'Computer Based Learning in Electronic Education', London, UK, 10 May 1995

AUTOMATED ANALYSIS OF STUDENT SOLUTIONS TO DESIGN PROBLEMS

Dr Andy Carpenter, Miss Hilary Kahn and Mr Mark Langham

Department of Computer Science, University of Manchester, Oxford Road, Manchester M13 9PL

The Teaching and Learning Technology Programme (TLTP) Electronic Design Education Consortium (EDEC) project is developing computer based learning (CBL) material to assist in the teaching of electronics on Electronic Engineering and Computer Engineering undergraduate and taught postgraduate courses. This material is divided into four themes, one of which concentrates on the teaching of System and High Level Design.

The abstract nature of the concepts involved in design means that teaching design to students is traditionally a difficult task. Experience has shown that the most effective approach is to set the students design problems which they must solve using the tools available. Because of their lack of experience, students will often make mistakes, and demonstrator assistance is required to ensure that the exercise can be completed within an acceptable period of time.

The CBL material being developed by the System and High Level Design theme of the EDEC project aims to replicate the standard learning environment while reducing the demands made on demonstrators. To support the expression of abstract designs the material is based around the use of VHDL and VHDL tools. The particular environment being developed aims to give students access to the standard VHDL tools that they would normally use, supplemented by some automated analysis of their design. This has two main attractions:

1. the use of standard tools means that CBL does not limit the way in which a student can express a design;

2. the load on demonstrators in identifying simple problems is reduced.

Some demonstrator effort will still be required, as the freedom to use real tools allows students to make complex mistakes which cannot be automatically identified.

Approach

The technique used by the material is to set the student a relatively simple design problem and provide a template solution. The student's full solution is simulated with a standard testbench. The results of this simulation are then presented as a series of waveforms, with differences between the student's results and the expected results highlighted as shown in Figure 1.

To assist a student unable to determine the cause of differences, an analysis tool is provided. This compares the student's results with results from solutions with known errors to internally identify the cause(s) of any differences from the expected results. To encourage the student to take an active part in debugging the design, the tool can only be activated once per design cycle (edit-compile-simulate loop). The tool does not immediately report the exact causes of errors to the student but gives general indications of problem areas which become more specific for each application of the tool. Figure 2 shows the results of running the tool multiple times on a circuit containing errors. In this case, the circuit description was not changed between runs.

Waveform Display Feedback

The waveform display, where differences between the student's results and the expected results are shown, is the less sophisticated of the two feedback methods used within the courseware. To generate this display, it is necessary for the courseware to know what the outputs from a correct circuit are and when the results from the student's solution differ from them. This is achieved by simulating a correct circuit with a standard testbench and then using this testbench to simulate the student's solution.



Figure 1: Highlighting Differences Between a Student's Results and Expected Ones

Results for run 2 of analysis:

92.3% match for ... There is a delay problem on data
75.0% match for ... set function problem
40.0% match for ... reset output problem
31.6% match for ... the clock has been omitted
19.0% of the errors obtained were undiagnosable

Results for run 3 of analysis:

92.3% match for ... There is no delay on data
19.0% of the errors obtained were undiagnosable

Figure 2: Results from Multiple Runs of Analysis Tool


To maximise the number of VHDL environments with which the courseware can be used, the preferred method for returning the results from the simulation environment to the courseware is via TextIO statements within the testbench. These write the value of all ports of the circuit to a text file each time the value of one of the ports changes. The courseware then processes these text files to produce an internal list of values for each port. Each element of this list is a triple of startTime, endTime and value.
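As an illustration, the sketch below shows how such a dump might be post-processed into the per-port triples. The one-line-per-event file format, the SIMULATION_END constant and all function names are assumptions made for illustration; the paper does not give the exact TextIO output format used by the EDEC testbenches.

    # A minimal sketch, assuming a dump format of one line per value change:
    # "<time> <port> <value>".
    from collections import defaultdict

    SIMULATION_END = 1000  # assumed time used to close off the final interval

    def read_port_values(path):
        """Turn a testbench dump into, for each port, a list of
        (startTime, endTime, value) triples."""
        changes = defaultdict(list)  # port -> [(time, value), ...]
        with open(path) as f:
            for line in f:
                fields = line.split()
                if not fields:
                    continue  # skip blank lines
                time, port, value = fields
                changes[port].append((int(time), value))
        waveforms = {}
        for port, events in changes.items():
            # each value holds from its change time to the next change time
            triples = [(t0, t1, v)
                       for (t0, v), (t1, _) in zip(events, events[1:])]
            last_time, last_value = events[-1]
            triples.append((last_time, SIMULATION_END, last_value))
            waveforms[port] = triples
        return waveforms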

The internal list of values is produced for each port of the correct results and the student's results. For each port, the two lists are compared to produce two more lists, each element of which has the form:

startTime, endTime, value, state

where state is either correct or error. It is then a simple process for the display program to use these lists to generate the required waveform displays.

Design Analysis Feedback

Immediately informing a student of the precise cause of any errors identified in a design does not encourage full participation in the analysis and debugging process. Hence, the analysis tool allows precise errors to be grouped together into categories which themselves may be further grouped together into larger categories. This grouping of categories can be defined to any depth. Associated with each precise error or category of errors is a message that is displayed to the student when the error, or an error in the category, is detected.

The first time that the analysis tool is used in a session it will identify the precise causes of errors but display only the message associated with the highest level of category to which the error belongs. The next time that the analysis tool is run it will display the message associated with the second level of error category. The message will become more specific each time that the tool is run until the message associated with the precise error is displayed.
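A minimal sketch of this progression follows, assuming each internally identified error carries the chain of messages from its outermost category down to the precise error; the representation and the wording of the top-level message are assumptions, while the lower two messages follow Figures 2 and 3.

    def message_for_run(message_chain, run_number):
        """Run 1 shows the outermost category message; each further run
        moves one level deeper until the precise message is reached."""
        level = min(run_number - 1, len(message_chain) - 1)
        return message_chain[level]

    chain = ["There is a data problem",           # top level category (assumed wording)
             "There is a delay problem on data",  # second level category
             "There is no delay on data"]         # precise error
    print(message_for_run(chain, 2))  # There is a delay problem on data
    print(message_for_run(chain, 3))  # There is no delay on data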

Figure 3 shows the configuration file on which the analysis results shown in Figure 2 are based. This uses a line based format. Lines starting with Error give a precise message associated with an explicitly detected error. Categories of errors are started by either the keyword Exclusive or the keyword Nonexclusive. The errors in an exclusive category cannot be present at the same time, i.e. at most one of the errors can be present. For example, the delay problem category, line 17, contains exclusive errors because a signal cannot have both a missing delay and a delay of 5ns at the same time. Non-exclusive categories contain sets of errors from which more than one error can be present. Identifying exclusive and non-exclusive categories is used to improve the quality of the analysis performed.

In Figure 2 the first message for run 2 of the analysis tool comes from the second level category defined at line 17 in the configuration file. For the next run of the tool the message associated with the precise error defined on line 19 is displayed. In contrast, the precise error for a missing clock, defined on line 28, is contained within one less category level. Hence, for run 2 the precise message is output.

    1   Exclusive: reset problem
    2   begin
    3       Error: reset is omitted
    4       Nonexclusive: reset output problem
    5       begin
    6           ...
    7       end
    8       Nonexclusive: reset function problem
    9       begin
    10          Error: reset is synchronous
    11          ...
    12      end
    13      ...
    14  end
    15  Nonexclusive: data problem
    16  begin
    17      Exclusive: delay problem on data
    18      begin
    19          Error: no delay on data
    20          Error: 5ns delay on data
    21      end
    22      ...
    23      ...
    24  end
    25  ...
    26  Exclusive: clock problem
    27  begin
    28      Error: clock omitted
    29  end
    30  ...
    31  Exclusive: There is a problem with set
    32  begin
    33      Nonexclusive: set function problem
    34      begin
    35          Error: set is asynchronous
    36          Error: There is no set, or set is forcing a 0 instead of a 1
    37      end
    38      ...
    39  end

Figure 3: Analysis Message File
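The format lends itself to a simple recursive reader. The sketch below is an illustration only, assuming the line numbers of Figure 3 are stripped, that begin and end appear on their own lines, and that the file is well formed; the dictionary representation is an invented convenience, not the courseware's internal form.

    def parse_message_file(lines):
        """Build a tree of categories and precise errors from the
        line based Error/Exclusive/Nonexclusive format."""
        root = {"kind": "Nonexclusive", "message": None, "children": []}
        stack = [root]
        pending = None  # most recent category, awaiting its begin
        for raw in lines:
            line = raw.strip()
            if line.startswith("Error:"):
                stack[-1]["children"].append(
                    {"kind": "Error",
                     "message": line[len("Error:"):].strip(),
                     "children": []})
            elif line.startswith(("Exclusive", "Nonexclusive")):
                kind, _, message = line.partition(":")
                pending = {"kind": kind.strip(),
                           "message": message.strip(),
                           "children": []}
                stack[-1]["children"].append(pending)
            elif line == "begin":
                stack.append(pending)
            elif line == "end":
                stack.pop()
            # "..." elision lines fall through and are ignored
        return root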

Analysis Details

The analysis tool displays details about potential causes of differences between the student's results and the expected results. Although the tool does not immediately inform the student of the precise cause of any differences, internally the precise cause is always identified.

The method used is to produce a series of model answers, each of which contains just one known error. These answers are then simulated using the standard testbench and the internal waveform data structures described above are generated for each port. The elements of these lists which indicate incorrect values are then extracted to produce a profile for each error.
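Continuing the earlier sketches, a profile can be extracted directly from the annotated segments. The set-of-tuples representation is again an assumption made for illustration.

    def profile_from_segments(segments_by_port):
        """Collect the (node, startTime, endTime, incorrectValue) elements
        of every segment marked as an error."""
        return {(port, t0, t1, value)
                for port, segments in segments_by_port.items()
                for t0, t1, value, state in segments
                if state == "error"}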

The analysis of a student's answer is performed by analysing the results of simulating the design to produce a profile for the experimental answer. This is then compared with the profiles for the known errors to determine the percentage match for each error; that is, the percentage of elements of a known error profile which are contained in the student's result profile.
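The matching step then reduces to set containment. The sketch below uses the known error profiles of Figure 4; the student profile shown is an invented example chosen to reproduce the percentages quoted below, since the student's own profile is not legible in this copy of the paper.

    def percentage_match(known_profile, student_profile):
        """Percentage of a known error profile's elements contained
        in the student's result profile."""
        return 100.0 * len(known_profile & student_profile) / len(known_profile)

    known = {
        "Problem_1": {("Y", 20, 25, 1), ("Y", 30, 35, 1)},
        "Problem_2": {("Y", 22, 25, 1), ("Q", 40, 65, 0)},
        "Problem_3": {("Q", 12, 23, 0)},
        "Problem_4": {("NQ", 32, 36, 1), ("Q", 46, 48, 1), ("Y", 64, 87, 1)},
    }
    student = {("Y", 22, 25, 1), ("Q", 12, 23, 0),
               ("Q", 46, 48, 1), ("NQ", 70, 75, 0)}  # final element matches no known profile
    for name, profile in known.items():
        print(name, round(percentage_match(profile, student), 1))
    # Problem_1 0.0, Problem_2 50.0, Problem_3 100.0, Problem_4 33.3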

Figure 4 shows the profiles for four known problems and those for a student's answer. Each element of a profile is contained within parentheses and the complete profile list within braces. A profile element contains four values: nodeInError, startTime, endTime and incorrectValue.




As only the first element of the profile for Problem_2 is contained in the student's answer profile, there is a 50% match to this known error. The percentage matches to the other known errors are: Problem_1 0%, Problem_3 100% and Problem_4 33%.

The student’s profile can contain elements not contained in any known profile, e.g. the last in the figure. These may be caused by unexpected errors in the design or interference between multiple errors. Experience has shown that once a testbench has been developed which reasonably isolates the effects of individual errors, the analysis tool can almost always detect individual errors in a design containing multiple errors.

To allow a student to concentrate on particular error messages, at most four messages are displayed. The errors selected are those with the highest percentage match.



Problem_1 = { ( Y, 20, 25, 1 ), ( Y, 30, 35, 1 ) }
Problem_2 = { ( Y, 22, 25, 1 ), ( Q, 40, 65, 0 ) }
Problem_3 = { ( Q, 12, 23, 0 ) }
Problem_4 = { ( NQ, 32, 36, 1 ), ( Q, 46, 48, 1 ), ( Y, 64, 87, 1 ) }

Figure 4: Analysis Profiles

Conclusions

Analysis tests for problems requiring students to design a 4-input multiplexor, a D-type flip-flop with asynchronous reset and synchronous set, a simplistic car park controller needing concurrency, and a finite state machine have been developed. These tests can successfully analyse solutions which contain multiple errors. They can detect both functional and delay problems. As the number of delay problems is potentially infinite, the only delay related tests used in practice are for the omission of a required delay.

As with all digital testing, it has been found that the difficulty lies in developing the testbench. Once a testbench has been developed which isolates as far as possible the effects of individual errors, circuits containing multiple errors can be analysed with a high degree of confidence in the results. The most difficult circuit proved to be the D-type flip-flop, where the effects of all of the errors are only visible through the single output.

The work has only just reached the point where it can be placed in front of students. The authors are confident that it will assist students to learn how to capture hardware concepts in a Hardware Description Language (HDL) and thus progress towards the point where they can debug many of the errors within their own designs for themselves. The test cases developed have been carefully selected to cover the major elements from which the majority of designs are formed.

© 1995 The Institution of Electrical Engineers. Printed and published by the IEE, Savoy Place, London WC2R 0BL, UK.
