
Automated test-data generation for testing exceptions

SOFTWARE—PRACTICE AND EXPERIENCE

Softw. Pract. Exper. 2000; 30:61–79

SUMMARY

This paper presents a technique for automatically generating test-data to test exceptions. The approach is based on the application of a dynamic, global-optimization-based search for the required test-data. The authors' work has focused on test-data generation for safety-critical systems. Such systems must be free from anomalous and uncontrolled behaviour. Typically, it is easier to prove the absence of any exceptions than to prove that the exception handling is safe. A process for integrating automated testing with exception-freeness proofs is presented as a way forward for tackling the special needs of safety-critical systems. The results of a number of simple case-studies are presented and show the technique to be effective. The major result shows the application of the technique to a commercial aircraft engine controller system as part of a proof of exception freeness. This illustrates how automated testing can be effectively integrated into a formal safety-critical process to reduce costs and add value. Copyright © 2000 John Wiley & Sons, Ltd.

KEY WORDS: test-data generation; verification; exception conditions

INTRODUCTION

A failure occurs when software is prevented from performing its intended action. A subclass of failures, known as exceptions [1], may be due to erroneous inputs, hardware faults or logical errors in the software code. The exception handling code of a system is, in general, the least documented, tested and understood part, since exceptions are expected to occur only rarely [2]. Indeed, in a case-study by Toy [3] more than 50% of the operational failures of a telephone switching system were due to faults in exception handling and recovery algorithms. In a more recent incident, the Ariane 5 launch vehicle was lost due to an unhandled exception, destroying $400 million of scientific payload [4].

CCC 0038–0644/2000/010061–19 $17.50
Copyright © 2000 John Wiley & Sons, Ltd.
Received 21 July 1999; Revised 17 September and 15 October 1999; Accepted 16 October 1999

N. Tracey et al.


Constraint_Error – data going out of range.
Storage_Error – running out of storage space.

Together with Program_Error and Tasking_Error, these are Ada's pre-defined exceptions. The SPARK-Ada tool-set [31] mitigates against the pre-defined exceptions other than Constraint_Error through language restrictions or static analysis, leaving Constraint_Error exceptions to be the focus of the work to date. Storage_Error is also unlikely to occur as dynamic memory allocation is not being used and therefore storage requirements can be calculated statically. Constraint_Error exceptions occur when a value goes out of range; examples include type bounds, array bounds, arithmetic underflow/overflow and attempts to divide by zero. The Ada compiler is responsible for inserting checks for these conditions into the object-code. Where a Constraint_Error is possible, the aim is to generate data that causes the run-time violation. This can be thought of in
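As an illustration of the kind of check an Ada compiler inserts, the following is a minimal Python sketch of a range check for a constrained subtype; the ConstraintError class and function names are ours, not part of Ada or the authors' tooling.

```python
class ConstraintError(Exception):
    """Stand-in for Ada's Constraint_Error in this sketch."""

def range_check(value, first, last):
    # Mirrors the run-time check inserted before assigning to a
    # constrained subtype: raise if value lies outside [first, last].
    if not (first <= value <= last):
        raise ConstraintError(f"{value} not in {first} .. {last}")
    return value

# A subtype such as "subtype Index is Integer range 0 .. 9" behaves as:
def assign_index(v):
    return range_check(v, 0, 9)
```

Test-data generation for exceptions then amounts to searching for an input that drives such a check to fail.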


Figure 2. Genetic algorithm.

The next generation (i.e. new population) is then selected from these offspring and the old population. A simple survival-of-the-fittest selection tends to converge on suboptimal solutions. Therefore, some hybrid of survival of the fittest and random selection is normally used. An outline of the genetic algorithm search process is shown in Figure 2.
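The outlined search process can be sketched as a minimal genetic algorithm; the population size, mutation rate and operators below are illustrative choices, not the authors' implementation.

```python
import random

def genetic_search(fitness, lo, hi, pop_size=20, generations=300, seed=1):
    """Minimise `fitness` over integer pairs (x, y) in [lo, hi]^2."""
    rng = random.Random(seed)
    pop = [(rng.randint(lo, hi), rng.randint(lo, hi)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:          # stopping criterion: fitness zero
            return pop[0]
        offspring = []
        for _ in range(pop_size):
            # Crossover: combine the X of one fit parent with the Y of another.
            (x1, _y1), (_x2, y2) = rng.sample(pop[:pop_size // 2], 2)
            child = [x1, y2]
            if rng.random() < 0.3:        # occasional mutation of X or Y
                child[rng.randrange(2)] += rng.choice([-1, 1])
            x = max(lo, min(hi, child[0]))
            y = max(lo, min(hi, child[1]))
            offspring.append((x, y))
        # Hybrid survival: keep the best half, fill the rest at random.
        merged = sorted(pop + offspring, key=fitness)
        elite = merged[:pop_size // 2]
        rest = rng.sample(merged[pop_size // 2:], pop_size - pop_size // 2)
        pop = elite + rest
    pop.sort(key=fitness)
    return pop[0]
```

With a smooth fitness such as abs(x + y − 1) (distance to a divide-by-zero condition), the hybrid selection keeps the search from stalling on a suboptimal cluster while the elite half preserves progress.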

OPTIMIZATION BASED TEST-DATA GENERATION

To enable a dynamic search to be used to locate test-data it needs to be given some guidance. This guidance is given in the form of a fitness function. The amount of guidance given by the fitness function is one of the key elements in determining the effectiveness of the test-data generation. The other key factor is, of course, the search technique itself. The input domain of most programs, D, is likely to be very large. A fitness surface could be formed by applying the fitness function to every possible program input. It is this surface which is effectively being searched when attempting to generate test-data. The size and complexity of the search space limits the effectiveness of simple gradient-descent or neighbourhood searches, as they are likely to get stuck in locally optimal solutions [37], hence failing to find the desired test-data. It is for this reason that global heuristic optimization techniques are used.

The fitness function must provide a measure of how close particular test-data is to executing the desired raise statement. The fitness function needs to return good values for test-data that nearly executes the raise statement and poorer values for test-data that is far from executing the raise statement. Since branch predicates determine the path followed, they are vital in determining an effective fitness function.

Branch predicates consist of relational expressions connected with logical operators. The fitness function is designed such that it will evaluate to zero if the branch predicate evaluates to the desired condition and will be positive otherwise. This is important as it gives a highly efficient stopping criterion for the search process. The fitness function is calculated as shown in Table I. In the table, K represents a failure constant which is added to further punish incorrect test-data.

Table I. Fitness function.

Element      Value
Boolean      if TRUE then 0 else K
a = b        if abs(a − b) = 0 then 0 else abs(a − b) + K
a /= b       if abs(a − b) /= 0 then 0 else K
a < b        if a − b < 0 then 0 else (a − b) + K
a <= b       if a − b <= 0 then 0 else (a − b) + K
a > b        if b − a < 0 then 0 else (b − a) + K
a >= b       if b − a <= 0 then 0 else (b − a) + K
not a        negation is propagated inwards
a and b      fitness(a) + fitness(b)
a or b       min(fitness(a), fitness(b))
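The shape of this fitness calculation for branch predicates can be written out directly; the following is a schematic Python rendering (K = 1 is an arbitrary choice of failure constant), not the authors' exact tool.

```python
K = 1  # failure constant, punishes test-data that misses the predicate

def fit_eq(a, b):      # desired condition: a = b
    return 0 if a == b else abs(a - b) + K

def fit_ne(a, b):      # desired condition: a /= b
    return 0 if a != b else K

def fit_lt(a, b):      # desired condition: a < b
    return 0 if a < b else (a - b) + K

def fit_le(a, b):      # desired condition: a <= b
    return 0 if a <= b else (a - b) + K

def fit_and(fa, fb):   # both sub-predicates must hold: add the distances
    return fa + fb

def fit_or(fa, fb):    # either sub-predicate may hold: take the nearer one
    return min(fa, fb)
```

The key property is that zero is reached exactly when the predicate takes the desired outcome, giving the search its stopping criterion, while positive values grade how far away the test-data is.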

In order to evaluate the fitness function it is necessary to execute an instrumented version of the SUT. There are two types of procedure call added by the instrumentation – branch evaluation and exception monitoring. Figure 3(a) shows a simple program that can raise a user-defined exception and a number of Constraint_Error exceptions.
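A minimal sketch of the two instrumentation calls follows; the names `branch`, `exception_monitor` and the toy SUT are ours for illustration, not the instrumentation produced by the authors' tool.

```python
branch_trace = []   # records (branch_id, distance) pairs during a run

def branch(branch_id, distance):
    # Branch-evaluation call: logs how close this run came to taking
    # the desired outcome at the identified branch.
    branch_trace.append((branch_id, distance))

def exception_monitor(run, *args):
    # Exception-monitoring call: executes the SUT and reports whether
    # the run raised, so the search can score the test-data.
    try:
        run(*args)
        return None
    except ZeroDivisionError as exc:
        return exc

# Instrumented version of a tiny SUT with one branch and one division.
def sut(x, y):
    branch(1, abs(x - y))      # original branch: if x = y then ...
    if x == y:
        return 10 // (x + y)   # can raise a divide-by-zero exception
    return 0
```

Each execution thus yields both the fitness information (from the branch calls) and the outcome of interest (whether the targeted exception was raised).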



The following illustrates how the fitness function works for the divide-by-zero exception (Constraint_Error) in the program in Figure 3(a). For the example the population size is set at five.

[Table: initial population of five members, giving each member's number, X and Y values, fitness (of the form n + K) and the branch at which it fails, together with the average fitness.]

[Table: five offspring (numbers 6–10) produced by crossover of pairs of parents, some with mutation of X or Y, together with their fitness values.]

The next generation is then selected, using a hybrid of elite survival and random selection. This new population is as follows.

[Table: the selected new population, giving each member's number, X and Y values, fitness (of the form n + K) and a description, together with the average fitness.]


[Table: the following generation (numbers 16–20) produced by crossover and mutation, with fitness values and an average fitness of 7.8.]

At this point the search can stop, as solutions 17 and 20 both have a fitness of zero and hence represent test-data that will raise the desired exception. Figure 4 shows the fitness surface for this divide-by-zero exception. There are 121 possible test inputs to this simple program; of these, it can be seen that only 6 raise the divide-by-zero exception – (0,1), (1,0), (2,−1), (3,−2), (4,−3) and (5,−4).
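The six solutions listed all satisfy X + Y = 1, so the count can be reproduced under an assumption of ours (not stated explicitly in the surviving text): the example program divides by X + Y − 1, with X in 0..10 and Y in −4..6, giving the 11 × 11 = 121 possible inputs.

```python
def raises_divide_by_zero(x, y):
    # Hypothetical reconstruction of the example SUT's failing condition:
    # a division whose denominator is x + y - 1.
    return x + y - 1 == 0

# Enumerate the assumed input domain and collect the failing inputs.
solutions = [(x, y) for x in range(0, 11) for y in range(-4, 7)
             if raises_divide_by_zero(x, y)]
total_inputs = 11 * 11
```

Under these assumed ranges, exhaustive enumeration recovers exactly the six inputs quoted in the text, which is why a guided search is worthwhile: only about 5% of the domain raises the exception.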

As can be seen from this example, the fitness function gives a quantitative measure of the suitability of the generated test-data for the purpose of raising a specified exception in the SUT. The global


H1: true.
H2: b >= index__first.
H3: b <= index__last.
->
C1: b >= integer__first.
C2: b <= integer__last.


Saturation does not give mathematically accurate results. Where overflow would occur the result saturates at the ceiling of the range, and similarly for underflow. For divide-by-zero the saturating operation returns the largest positive or negative value in the range, depending on the sign of the dividend.
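The saturating operators described can be sketched as follows; the range bounds are illustrative (taken from the smooth-signal verification conditions later in the text), and the function names are ours.

```python
LO, HI = -250_000, 250_000   # illustrative representable range

def sat(value):
    """Clamp a result to the representable range (saturation)."""
    return max(LO, min(HI, value))

def sat_sub(a, b):
    # Saturating subtraction: mathematically wrong near the range limits,
    # but never raises an overflow/underflow exception.
    return sat(a - b)

def sat_div(a, b):
    # Saturating division: on divide-by-zero, return the largest value in
    # the range with the sign of the dividend, as the text describes.
    # (Python floor division is used here purely for illustration.)
    if b == 0:
        return HI if a >= 0 else LO
    return sat(a // b)
```

This is why the resulting test-data remains interesting even when no exception can be raised: the clamped result is well-defined but mathematically incorrect.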


[Table: case-study results, with columns SUT Name, INP, NE, EF, C and T; the totals row reads 24 and 23.]

Constraint_Errors need to be considered. Figure 6 shows an example routine from the engine controller code.


H1: currentval >= -250000.
H2: currentval <= 250000.
H3: goodval >= -250000.
H4: goodval <= 250000.
->
C1: currentval - goodval >= -250000.
C2: currentval - goodval <= 250000.

Figure 7. Verification conditions for smooth signal.
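The hypotheses H1..H4 do not entail C1 and C2: under them, currentval − goodval ranges over [−500000, 500000]. A brute-force check over the corners of the hypothesis region (a sketch of ours, not the authors' proof tooling) finds the violating assignments directly.

```python
BOUND = 250_000

def vc_holds(currentval, goodval):
    # Conclusions C1 and C2 of Figure 7: the difference stays in range.
    return -BOUND <= currentval - goodval <= BOUND

# Search the corners of the hypothesis region H1..H4 for counterexamples.
corners = [(c, g) for c in (-BOUND, BOUND) for g in (-BOUND, BOUND)]
counterexamples = [(c, g) for c, g in corners if not vc_holds(c, g)]
```

Two of the four corners violate the conclusions, which is exactly why these verification conditions cannot be discharged and test-data generation is brought to bear on them.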

the exception would be raised. Where the test-data generation was successful, the test-data illustrated a condition under which the run-time rules of the Ada language would be violated and hence an exception raised. The application of the test-data generation technique requires the execution of an instrumented version of the software, as discussed earlier. The validity of testing instrumented code is problematic for safety-critical systems. However, we view the test-data generation technique simply as a method of obtaining test-data. The development team should then use this test-data within their certified development environment to validate it. Figure 7 shows the remaining two verification conditions for the smooth signal subprogram (the irrelevant hypotheses have been removed).

Test-data was generated for each, illustrating that an exception could be raised. For example, a count input value of 100 when either (CurrentVal − GoodVal) > SmoothThresh or (GoodVal − CurrentVal) > SmoothThresh is true will cause an exception.

As already stated, for the final engine controller system the run-time checks are turned off. This means that exceptions would not be raised, but rather that data values would become invalid. This could have serious safety implications for the system, as the engine control-laws may not be stable with invalid data. A detailed investigation into these situations showed that violation of the run-time rules (and hence potentially invalid data) was not possible in the current system. The use of protected arithmetic operators, which are well-defined in the presence of divide-by-zero, overflow and underflow, prevents a large number of these cases. However, in these cases the resulting test-data is still interesting because the arithmetic operators return a mathematically incorrect result. In general it is important to know the situations when this can happen. The physical value ranges of sensor readings also prevented a number of exception conditions occurring in practice. This can be seen in the smooth signal example: the sensors reading currentval and goodval can only give values such that currentval − goodval is always in range. The overflow of integer counters was another potential cause of exceptions. On closer inspection, none of these overflow conditions could arise. Typically the counters would be reset after a number of iterations according to fixed global data stored as part of the engine configuration. In
