Friday, June 11, 2010

Category-partition Testing

Testing, Verification & Validation
X Robot families

INF4290
Semester Project Report
Title : Testing of the X Family Robots
Akbar Faghihi Moghaddam - (Shahab)


TABLE OF CONTENTS
  1. INTRODUCTION
  2. BACKGROUND
  3. DESIGN OF CASE STUDY
    1. The SUT
    2. The System Specification
    3. Testing Phase
      1. Enable / Disable Motor
      2. Restart
      3. Rotation of the robot arm
        1. Base Choice (BC)
        2. Each Choice (EC)
        3. All combinations (AC)
  4. ANALYSIS OF THE RESULT
  5. LESSONS LEARNED AND OPEN ISSUES
  6. REFERENCES

INTRODUCTION
This is my semester project in testing and verification for the INF4290 course (Testing, Verification & Validation) at the Department of Informatics, University of Oslo. The semester project is supposed to apply some testing techniques to actual software, quantitatively compare the techniques, and assess their drawbacks and advantages. My project is testing a robotic package solution used in different projects here at the Department of Informatics (IFI). The System Under Test (SUT) is the robotic solution of the X family robots developed at the Robotics and Intelligent Systems (ROBIN) group. These solutions are designed to be used as the foundation of master projects and for other educational purposes (INF3480, Introduction to Robotics). Three generations of X family robotic solutions have been developed by Mats Høvin up to now. You can see the X2 version of this family on the right side of the page. The X3 version is much smaller, with some design improvements, as the new 3D printer at the ROBIN group allows us to print our designs at even smaller scales. The main goal of all of these designs is to rotate the jointed arms so that the end point reaches a target in the robot's workspace.

My interest in this particular project comes from its involvement in my own master thesis. My master project is a climbing robot, which will use four X3 modules as its arms. The reason for choosing this specific system as my SUT was to gain a deep understanding of its functionality and of the way it is developed by testing it. Clearly, another goal of this testing project is to find the errors, document and map them, and finally debug them later, as re-developing this system is part of my master thesis. It should be mentioned that testing this system has given me a deep understanding of almost all the advantages and disadvantages of the SUT and has prepared me to re-develop an error-free version of it in the future. I also plan to improve my own testing skills and maybe include an improved version of this report as part of my thesis. One of my goals in this project was to gain a deep understanding of the code, which I would have reached by using White Box testing. This was not possible, as is explained in detail in the Background section of this report.

In this report I will describe the challenges I faced while writing it, challenges which in the end made me set aside some of my initial choices and pick Black Box testing, and specifically category-partition, as the right technique for this paper. The report is structured to give a good description of the system itself, of the methods and techniques in detail together with the test cases used to perform the functional testing, and finally of the results and the analysis of the results I reached through this test.


BACKGROUND

To my knowledge no one has ever written a report on testing the X family robots, so this is not only the first paper written about this product family; testing as a separate activity is generally not common at the Robotics group here at IFI. I hope this paper can be the beginning of a new point of view on robotic development at the ROBIN group. The decision on the right method for testing a robotic project was not easy at all. I spent several hours trying to find my way through the many different options for a suitable SUT and the right method and tools for it (the choices are simply too many for an inexperienced tester like me). For testing the functionality of my chosen SUT, I finally settled on the category-partition technique of Black Box testing. The reader might ask: why not the others? For example, why not White Box testing, which would give me more understanding of the code behaviour, one of my intentions in choosing this SUT? Why not model checking, which is a suitable method for hardware testing? Or simply, why not random testing?

  • White Box Testing ? Not a good idea in this case ...
    As I will describe in the section "Design of the case study", the programming language used in the SUT is not a standard, widely used language. It is a kind of semi-C++ language developed for programming micro controllers. After looking around for a while, I found that there are no tools I could use to automate test cases (e.g. JUnit for Java) and no extra gadget to give a coverage rate of the program traced by the test cases (e.g. the EclEmma plugin for Eclipse, again for Java). I also considered writing manual test cases that I could run on the code and developing a small program that could give an estimated coverage (like EclEmma) of the code traversed by the test cases. This was not possible because of lack of time. All the difficulties of running White Box testing made me look away from it.

  • Model Checking ? Perfect, but not now ...
    A perfect way to describe and test the hardware design. The problem arises because, in our case, some of the system specification would have to be changed to fit model checking's need for Boolean inputs. This method is a candidate for a future report.

  • Black Box Testing ? The one I was looking for ...
    I should underline that one of the reasons for choosing the Black Box testing method is my lack of experience in testing, which is limited to the lectures in this course. Black Box testing methods seem conceptually easier than other methods, which can get too complicated from time to time. Black Box testing focuses on the functionality without really digging into the technical parts, which is perfect for someone like me who is new to this domain (micro controller programming).


  • Random Testing ? It could also have been used, but was set aside because of the lack of time to perform multiple test runs ...

I have chosen Black Box testing; now what? By definition, the goal of functional testing is to find discrepancies between the actual behavior of the implemented system's functions and the desired behavior as described in the system's functional specification. The Black Box method treats the system as a "black box": it designs test cases based on inputs, outputs and general functionality as defined in the requirements specification, and does not care about the object's internal structure. The method is only interested in determining whether the output is correct or incorrect. This means that by choosing this method I am not going to reach the goal of understanding the code of the SUT by testing it; instead I will be testing the functionality of the system and how well it does what it is supposed to do (the system specification). Functional tests can usually be derived from three sources: the system specification, design information and the code itself. Although we have access to the code here, we are going to act as if we have no clue about the internal structure of the program (Black Box testing) and only check the functionality of the system against test cases designed from the system specification.
Before choosing category-partition within Black Box testing, I did almost half of the report based on Equivalence Class Testing, which was not a success. The problem stemmed from dividing the system specification into equivalence classes, which naturally could not be made exclusive. Therefore, with help from the lectures, I chose category-partition, which extends and combines Equivalence Class Testing and boundary value analysis. In this technique the system is divided into individual "functions" that can be independently tested. The method identifies the parameters of each "function" and, for each parameter, identifies distinct categories. The categories are further subdivided into choices, in the same way as equivalence partitioning is applied. The method emphasizes both the specification-coverage and the error-detection aspects of testing.

To do this we first need a clear idea of how the program functions and what our system specification and requirements are, so that we can decompose them into small test cases for category-partition. The SUT in this report is first divided into different functions, then into categories and finally into choices, all of which are described in detail in "Design Of The Case Study".
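
To make this decomposition concrete, here is a minimal sketch (my own illustration in plain C++, not part of the SUT or of any tool used in this project) of how a function, its categories and their choices can be written down as data before test frames are derived from them; the names and values below are hypothetical placeholders:

    #include <iostream>
    #include <string>
    #include <vector>

    // One category of a parameter, e.g. "inside the range", with its concrete choices.
    struct Category {
        std::string name;
        std::vector<std::string> choices;
    };

    // One testable function of the SUT and the categories of all its parameters.
    struct Function {
        std::string name;
        std::vector<Category> categories;
    };

    int main() {
        // Hypothetical values; the real decomposition follows later in this report.
        Function rotate{"Rotation of the robot arm",
                        {{"angle inside the range", {"-360", "+120", "+360"}},
                         {"p inside the range",     {"1", "10", "255"}}}};

        for (const Category& c : rotate.categories) {
            std::cout << c.name << ":";
            for (const std::string& choice : c.choices) std::cout << " " << choice;
            std::cout << "\n";
        }
        return 0;
    }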

DESIGN OF CASE STUDY

In this part I will first describe how the SUT functions and then try to give a clear picture of the system specification. Afterwards I will go through the details of how I actually divided and conquered the problems I faced while using the category-partition testing technique, dividing the program into smaller functions, and the functions into categories and choices. I will also explain in detail the test cases and test tables, with expected results and the actual results, at the end of this section. The detailed analysis of the test case results is presented in the next part, "Analysis Of The Result".

The SUT

All the X family robots are made of three different parts :


    • The controller program
    • The Electronics solutions 
    • The Design


    I'll try to give a brief description of each part here :
    Controller Program
    The control program is a rather messy piece of code: it is an alpha version of the controller project and is under development right now while being used in different projects at the ROBIN group. The code consists of 375 lines of the Wiring language, which belongs to the C++ family. As most controllers do, the program is in charge of reading and writing data to the pins on the micro controller board, and its main goal is to rotate the arm to a pre-defined angle. The main area of my focus in this report is the controller program.
    Electronics solutions
    The main part of the electronic solution behind this module is an Arduino card. An Arduino is a single-board micro controller together with a software suite for programming it. The hardware consists of a simple open hardware design for the controller with an Atmel AVR processor and on-board I/O support. The software (the language the controller is developed in) consists of a standard programming language and the boot loader that runs on the board. The card is connected to the desired DC motor through a motor driver and an encoder which measures the angle traced by the motor shaft.
    Design
    The design is by Mats Høvin and is implemented in SolidWorks 2009. The design was later printed in special plastic material on the 3D printer in the ROBIN lab. One of the goals of the test could have been to check whether the design has anything to do with system failures, but this was set aside because of lack of time.
    Now that we have a clear picture of how the system is built and what its main goal is, it is time to describe the system specification.

    The System Specification :

    The controller program does its job with the help of some inputs. It is supposed to receive a set of data and, after adjusting itself to the received data, it should rotate to the pre-defined angle. This is the main goal of the controller program. The inputs are the angle, the proportional constant, choosing the verbose mode, restarting on demand, and the ability to turn the motor driver ON and OFF while the system is still running and the controller program has not been terminated. After the controller code is started, it prints the following welcome message :

    Welcome to the X2 controller.
    Variables :
    a = targetAngle, 16 bit signed integers, p = proportional constant, 8bit unsigned Byte
    Loop error is divided by p
    Commands :
    V - Verboose ON / Very verboose ON (ASCII modes)
    v - Verboose OFF (binary modes)
    m - Toggle motor enable
    r - Restarts the system
    a1234 - sets target angle(-360 - +360)
    p123 - sets prop constant p(1-255)
    Verboose ON

    Even without any previous knowledge of the code, the welcome message gives us the following information about the inputs. We now know that it offers the following choices :

    • "a" which stands for angle :
      According to welcome message, angle is defined as a
      16 bit signed integer(-32767 - +32766), but limited to -360 to +360 in the program specification
    • "p" which stands for proportional constant :
      According to welcome message, p is defined as a
      8 bit unsigned Byte(0-255), but limited to 1 to 255 in the program specification
    • "V/v" for Verboose mode which makes the system to operate in verboose mode, meaning it will print lots of feedback from the system rotating.
    • "m" which turns the motor driver ON and OFF.
    • "r" which restart the whole system.

    After receiving the inputs, the program is ready to start the rotation. The angle and proportional constant values are processed first and then a signal is sent to the Arduino micro controller. The micro controller sends the new commands to the motor driver and the driver steers the motor according to the received input. Finally the motor rotates to the desired angle, but this happens only under special conditions: the angle should be given within the range (between -360 and +360) and the proportional constant should be given according to the type of motor that is used. So here is our clear system specification :
    The program receives the angle (16 bit signed integer) and the proportional constant (8 bit unsigned Byte). After receiving these two inputs, it is ready to start rotating. The rotation can happen in two different modes (verbose / non-verbose mode). On demand the user can turn the motor driver OFF and ON without terminating the program, and there is also a function available for restarting the system.
    I should mention that in this report I decided to set aside the verbose functionality, as the only thing it does is to print some feedback information from the motor driver.
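
    The welcome message also states that the loop error is divided by p, so I assume the regulation works roughly like the following one-step sketch (an assumption on my part, written in plain C++ rather than the actual Wiring code):

        #include <cstdio>

        // Hypothetical one-step illustration of the control rule described in the
        // welcome message: the loop error is divided by the proportional constant p.
        int controlStep(int targetAngle, int currentAngle, int p) {
            int error = targetAngle - currentAngle;  // degrees left to rotate
            if (p == 0) return 0;                    // guard: the real controller crashed here (division by zero)
            return error / p;                        // motor command; a larger p gives a gentler regulation
        }

        int main() {
            // Example: target +120 degrees, currently at 0, p = 10.
            std::printf("motor command = %d\n", controlStep(120, 0, 10));  // prints 12
            return 0;
        }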

    Testing Phase

    To reach the goal of our functional testing, we should test all the system functions, and the tests should be designed to maximize the chance of finding errors in the software. As category-partition prescribes, in this phase I have covered both the specification-coverage and the error-detection aspects of testing. Below I decompose the system specification into functional units which can be tested independently. The controller program can perform the following tasks, which in the category-partition technique we call functions.
    The Program Functions are :
    1. Enable/Disable Motor
    2. Restart
    3. Rotation of the robot arm

    Enable / Disable Motor

    The Enable / Disable motor function is performed by entering "m" as input to the program. This will enable or disable the motor driver, and we can assume that we are dealing with a Boolean variable (True or False). The motor driver is turned ON when m is entered an odd (1, 3, 5, ...) number of times, while it is turned OFF when m is entered an even (2, 4, 6, ...) number of times. In this function we have just one parameter, and the parameter is a Boolean value. There are two categories for this parameter :

    • Motor driver in disabled mode
    • Motor driver in enabled mode
    Our only choice for each of these two categories is to check whether the system performs as it should and there is no malfunction. The expected behaviour is that the motor does not respond while it is in the disabled mode and responds to our orders as soon as it enters the enabled mode. We can generalize this by saying that if m is pressed an odd number of times the motor ends up enabled, and if it is pressed an even number of times it ends up disabled (a small sketch of this toggle logic follows the test table below).

    Category                        Choices
    Motor driver in enabled mode    m (is entered for the 2*N+1 (odd) time)
    Motor driver in disabled mode   m (is entered for the 2*N (even) time)


    Formal Test Specification :
    1. m is pressed for the (2*N)+1 time
    2. m is pressed for the (2*N) time

    Test case 1 (m1) : m is pressed an odd number of times
      Expected result : The motor is turned ON
      Actual result   : The motor is turned ON
    Test case 2 (m2) : m is pressed an even number of times
      Expected result : The motor is turned OFF
      Actual result   : The motor is turned OFF
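
    As announced above, here is a minimal sketch of the odd/even toggle behaviour that these two test cases exercise (my own illustration; the initial state is an assumption, not taken from the SUT code):

        #include <iostream>

        int main() {
            // Assumed initial state, chosen so that an odd number of presses
            // leaves the motor ON, matching the table above.
            bool motorEnabled = false;

            // Press 'm' six times: odd presses enable, even presses disable.
            for (int press = 1; press <= 6; ++press) {
                motorEnabled = !motorEnabled;
                std::cout << "m pressed " << press << " time(s): motor "
                          << (motorEnabled ? "ON" : "OFF") << "\n";
            }
            return 0;
        }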

    Restart

    The restart function is performed by entering "r" and, according to the program help, it should reset all the inputs and set the current angle to zero. The implementation of this function, with one Boolean parameter, again leaves us with two categories: the one where "r" is entered and the one where "r" is not entered :
    • "r" is entered and the system is restarted
    • "r" is not entered and the system state should not be reset
    The only choice for each category is to check whether the current angle is actually set to 0 after restarting.

    Category                 Choices
    System is restarted      "r" is entered
    System is not restarted  "r" is NOT entered

    Formal Test Specification :
    1. r is pressed
    2. r is NOT pressed

    Test case 1 (r1) : r is pressed
      Expected result : The system restarts and the current angle is equal to 0
      Actual result   : The system restarts and the current angle is equal to 0
    Test case 2 (r2) : r is not pressed
      Expected result : The system continues to work as described in the system specification, keeping its previous current angle
      Actual result   : The system continues to work as described in the system specification, keeping its previous current angle

    Rotation of the robot arm

    This function realizes the main goal of the whole system and is therefore the most important one. The implementation allows the program to move (rotate) the robot arm to a pre-defined angle (input a). The function should be able to rotate the robot arm to the desired angle with the desired proportional constant. There are two parameters for this function :

    1. Angle (Entered as a#)
    2. Proportional constant (Entered as p#)





    Based on the format of the input, our information from the system specification and the system environment, the first parameter (angle) can be divided into the following categories :

    • Inside the range (-360 to +360)
    • Out of the range
    • Input with fewer than 4 digits
    • Invalid input
    The choices we have for the mentioned categories of the angle parameter are as follows :
    Category                          Choices
    Inside the range (-360 to +360)   -360, [-359 to +359], +360
    Out of the range                  -361, +361
    Fewer than 4 digits               one digit, two digits, three digits
                                      (a 16 bit signed integer permits 5 digits, but five-digit inputs are not included here, as the test case would no longer be exclusive - it would intersect with the "Out of range" category)
    Invalid input                     a(NULL), a_, a1z2b, ...
    Based on the format of the input, our information from the system specification and the system environment, the second parameter (proportional constant) can be divided into the following categories :

    • Inside the range (1-255)
    • Out of the range
    • Input with fewer than 3 digits
    • Invalid input
    The choices we have for the mentioned categories of the proportional constant parameter are as follows :
    Category                  Choices
    Inside the range (1-255)  +1, [+2 to +254], +255
    Out of the range          0
                              (the 8 bit unsigned Byte makes 0-255 our range, but the last element 255 is not out of range, so there is no upper out-of-range value)
    Fewer than 3 digits       one digit, two digits
                              (an 8 bit unsigned Byte does not permit more than 3 digits)
    Invalid input             p(NULL), p_, p12b, ...
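
    The in-range and out-of-range categories above correspond directly to two simple validation rules. As an illustration of what such checks could look like (again a sketch of my own, not the controller's code), consider:

        #include <iostream>

        // Valid target angle: -360 .. +360 (program specification).
        bool validAngle(long a) { return a >= -360 && a <= 360; }

        // Valid proportional constant: 1 .. 255 (8 bit unsigned, zero excluded to avoid division by zero).
        bool validP(long p)     { return p >= 1 && p <= 255; }

        int main() {
            std::cout << std::boolalpha
                      << validAngle(-361) << " "   // false: out of range
                      << validAngle(+360) << " "   // true: boundary value, still valid
                      << validP(0)        << " "   // false: would divide by zero
                      << validP(255)      << "\n"; // true: upper boundary, still valid
            return 0;
        }
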
    Formal Test Specification :
    Angle ( a ) :
      1. -361            [Error]
      2. -360            [Property p ok, rotates -360]
      3. -359 to +359    [Property p ok, rotates according to the input angle]
      4. +360            [Property p ok, rotates +360]
      5. +361            [Error]
      6. a1              [Property p ok, should rotate +1]
      7. a12             [Property p ok, should rotate +12]
      8. a123            [Property p ok, should rotate +123]
      9. aNULL / a012z   [Error]
    Proportional Constant ( p ) :
      1. 0               [Error]
      2. 1               [Property angle ok, regulates with p=1]
      3. 2 to 254        [Property angle ok, regulates with the input p]
      4. 255             [Property angle ok, regulates with p=255]
      5. p1              [Property angle ok, should regulate with p=1]
      6. p12             [Property angle ok, should regulate with p=12]
      7. pNULL / p02z    [Error]
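
    To show where the test frames in the following subsections come from, the sketch below (an illustration only, not a tool that was actually used) pairs the in-range angle choices with the in-range p choices; the All Choices table is essentially this Cartesian product plus the single-parameter error cases, and the representative values are examples of my own choosing:

        #include <iostream>
        #include <string>
        #include <vector>

        int main() {
            // Representative in-range choices (a2..a4 and p2..p4 in the formal test specification).
            std::vector<std::string> angleChoices = {"-0360", "+0120", "+0360"};
            std::vector<std::string> pChoices     = {"+001", "+010", "+255"};

            // All Combinations: every angle choice with every p choice (3 x 3 = 9 frames).
            int frame = 0;
            for (const std::string& a : angleChoices)
                for (const std::string& p : pChoices)
                    std::cout << "frame " << ++frame << ": a" << a << " p" << p << "\n";

            // Each Choice would only need 3 of these 9 frames (one per choice),
            // plus the error cases that are not combined with anything.
            return 0;
        }
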
    Base Choice (BC)
    This criterion is a compromise. A base choice is chosen for each category, and a first base test is formed by using the base choice of every category. The base choice is usually the simplest, smallest or first one, as it is in this report. I should mention that in the case of the robustness test cases (out of the boundaries), the invalid inputs and the environment-related cases (inputs with fewer than the expected number of digits), I have chosen not to combine them with other test cases, as the result would be the same. This has reduced the number of test frames and spared much time and effort. The same applies in the Each Choice and All Choices combinations.

    Test case 1 (a1) : Angle = +0361
      Expected result : Not to accept the input
      Actual result   : Did not accept the input
    Test case 2 (a3p3) : Angle = +0120, P Const = +010
      Expected result : To rotate 120 degrees with fine regulation
      Actual result   : Rotated 120 degrees with fine regulation
    Test case 3 (a7) : Angle = +90
      Expected result : The program should read 90 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 4 (a9) : Angle = +02by
      Expected result : The program should handle the invalid input data
      Actual result   : The program crashed
    Test case 5 (p1) : P Const = 0
      Expected result : The program should not accept p as zero
      Actual result   : The program crashed / division by zero !!!
    Test case 6 (p6) : P Const = 25
      Expected result : The program should read 25 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 7 (p7) : P Const = zbm
      Expected result : The program should handle the invalid input data
      Actual result   : The program crashed

    Result : The number of test cases which failed after running the Base Choice category-partition is 5 out of 7. Base Choice is the first attempt at testing the program and can show the margins where the errors are to be found. For example, in our case we can see that the errors are found on invalid inputs (p = zbm instead of a number), on the format of the entered input (fewer than the expected number of digits, e.g. angle = +25 instead of +0025) and on the values around the boundaries in robustness testing (p = 0, which is the lower boundary of the proportional constant).
    Each Choice (EC)
    Each Choice uses more test cases than Base Choice here, although it is in theory a weaker criterion: one value from each choice of each category must be used in at least one test case.

    Test case 1 (a1) : Angle = -0361
      Expected result : Not to accept the input
      Actual result   : Did not accept the input
    Test case 2 (a2p3) : Angle = -0360, P Const = +010
      Expected result : To rotate -360 degrees with fine regulation
      Actual result   : Rotated -360 degrees with fine regulation
    Test case 3 (a3p2) : Angle = +0120, P Const = +001
      Expected result : To rotate +120 degrees with fine regulation
      Actual result   : Rotated around +120 degrees, but the precise regulation at extreme low values seems to fail (boundary test)
    Test case 4 (a4p4) : Angle = +0075, P Const = +255
      Expected result : To rotate +75 degrees with fine regulation
      Actual result   : Rotated around +75 degrees, but the precise regulation at extreme high values seems to fail (boundary test)
    Test case 5 (a5) : Angle = +0361
      Expected result : Not to accept the input
      Actual result   : Did not accept the input
    Test case 6 (a6) : Angle = +5
      Expected result : The program should read 5 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 7 (a7) : Angle = +90
      Expected result : The program should read 90 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 8 (a8) : Angle = +270
      Expected result : The program should read 270 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 9 (a9) : Angle = +02by
      Expected result : The program should handle the invalid input data
      Actual result   : The program crashed
    Test case 10 (p1) : P Const = 000
      Expected result : The program should not accept p as zero
      Actual result   : The program crashed / division by zero !!!
    Test case 11 (p5) : P Const = +3
      Expected result : The program should read 3 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 12 (p6) : P Const = +25
      Expected result : The program should read 25 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 13 (p7) : P Const = zbm
      Expected result : The program should handle the invalid input data
      Actual result   : The program crashed

    Result : The number of test cases which failed after running the Each Choice category-partition is 10 out of 13. The result from this part confirms the results from the first try (Base Choice). We can also see that Each Choice has managed to find a new kind of error: errors in the regulation at the boundary values of p (1 and 255).
    All combinations (AC)
    Every choice of every category must be combined with every (possible) choice of every other category.


    Test case 1 (a1) : Angle = -0361
      Expected result : Not to accept the input
      Actual result   : Did not accept the input
    Test case 2 (a2p2) : Angle = -0360, P Const = +001
      Expected result : To rotate -360 degrees with fine regulation
      Actual result   : Rotated -360 degrees with fine regulation
    Test case 3 (a2p3) : Angle = -0360, P Const = +010
      Expected result : To rotate -360 degrees with fine regulation
      Actual result   : Rotated -360 degrees with fine regulation
    Test case 4 (a2p4) : Angle = -0360, P Const = +255
      Expected result : To rotate -360 degrees with fine regulation
      Actual result   : Rotated around -360 degrees, but the precise regulation at extreme high values seems to fail (boundary test)
    Test case 5 (a3p2) : Angle = +0045, P Const = +001
      Expected result : To rotate +45 degrees with fine regulation
      Actual result   : Rotated around +45 degrees, but the precise regulation at extreme low values seems to fail (boundary test)
    Test case 6 (a3p3) : Angle = +0075, P Const = +010
      Expected result : To rotate +75 degrees with fine regulation
      Actual result   : Rotated +75 degrees with fine regulation
    Test case 7 (a3p4) : Angle = +0125, P Const = +255
      Expected result : To rotate +125 degrees with fine regulation
      Actual result   : Rotated around +125 degrees, but the precise regulation at extreme values seems to fail (boundary test)
    Test case 8 (a4p2) : Angle = +0360, P Const = +001
      Expected result : To rotate +360 degrees with fine regulation
      Actual result   : Rotated +360 degrees with fine regulation
    Test case 9 (a4p3) : Angle = +0095, P Const = +010
      Expected result : To rotate +95 degrees with fine regulation
      Actual result   : Rotated +95 degrees with fine regulation
    Test case 10 (a4p4) : Angle = +0360, P Const = +255
      Expected result : To rotate +360 degrees with fine regulation
      Actual result   : Rotated around +360 degrees, but the precise regulation at extreme values seems to fail (boundary test)
    Test case 11 (a5) : Angle = +0361
      Expected result : Not to accept the input
      Actual result   : Did not accept the input
    Test case 12 (a6) : Angle = +5
      Expected result : The program should read 5 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 13 (a7) : Angle = +90
      Expected result : The program should read 90 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 14 (a8) : Angle = +270
      Expected result : The program should read 270 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 15 (a9) : Angle = +02by
      Expected result : The program should handle the invalid input data
      Actual result   : The program crashed
    Test case 16 (p1) : P Const = 000
      Expected result : The program should not accept p as zero
      Actual result   : The program crashed / division by zero !!!
    Test case 17 (p5) : P Const = 3
      Expected result : The program should read 3 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 18 (p6) : P Const = 25
      Expected result : The program should read 25 as input
      Actual result   : The program does not crash but registers a wrong input value
    Test case 19 (p7) : P Const = zbm
      Expected result : The program should handle the invalid input data
      Actual result   : The program crashed

    Result : The number of test cases which failed after running the All Choices category-partition is 12 out of 19. All Choices covers all the possible combinations, and here we find even more errors, but they seem to stem from the same sources (categories) that we already found at the Base Choice and Each Choice levels.

    ANALYSIS OF THE RESULT

    The analysis of the overall result of the category-partition testing is presented in this section. Here we have some charts showing the detailed numbers of test cases run on the SUT, the number of passed results and the failures. I have also made some charts showing the error ratio of each category compared with the system as a whole.
    Function                 Method                             All  Passed  Failed
    Enable / Disable Motor   Category-Partition                   2       2       0
    Restart                  Category-Partition                   2       2       0
    Rotation of the arm      Category-Partition (All Choices)    19       7      12
    SUM                      Category-Partition                  23      11      12
    Number of test cases in each function
    As can be seen here, the total number of test cases run on the system (for the 3rd function, only All Choices is counted) is 23, which resulted in 11 passed cases and 12 failed. This brings our overall error rate to 52.17 %.
    My main focus in the result part is on the 3rd function, the rotation of the arm. This function is the most important part of the implementation; without it the whole system is not usable. Below I have also gathered detailed results of the tests on this function (Rotation of the robot arm). Here we can see the detailed number of test cases run on the SUT and the result, both as numbers and as a failure percentage.

    Method        All  Passed  Failed
    Base Choice     7       2       5
    Each Choice    13       3      10
    All Choices    19       7      12
    Number of test cases in each of the category-partition methods for the Rotation of the robot arm function

    Method        Failed percentage
    Base Choice   71.4 %
    Each Choice   76.9 %
    All Choices   63.1 %
    Error rate in each of the category-partition methods for the Rotation of the robot arm function
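
    For completeness, the percentages above are simply the number of failed test cases divided by the total number of test cases in each method:

      Base Choice :  5 / 7  ≈ 0.714  ->  71.4 %
      Each Choice : 10 / 13 ≈ 0.769  ->  76.9 %
      All Choices : 12 / 19 ≈ 0.631  ->  63.1 %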

    As we can see in the statistics, the percentage of discovered errors first rises, which looks normal, since we increased the number of test cases and found more errors while testing the faulty categories. This trend does not continue when we compare the Each Choice results with All Choices: the percentage of discovered errors actually fell from 76.9 down to 63.1 even though we added test cases covering all the categories. This does not look normal, but I think I have a good idea of what is happening. My explanation for this phenomenon is what I mentioned earlier: to save time and effort (as done in the lecture), I did not take the Cartesian product for the invalid-type and environment-related test cases. This might mislead us if we do not keep that choice in mind, but the result shows that the program works just fine doing what it was implemented for (with a few errors at the extreme values of the proportional constant), namely rotating to the pre-defined angle, but that it has big issues when checked against invalid data types and system-environment aspects (flexibility in reading input with different numbers of digits, while it is nowhere mentioned that the input data should have four digits). The error rate while these categories dominate (Base Choice and Each Choice) gets up to almost 77 %. This is also confirmed by the results of errors found per category. As we can see in the chart below, the number of "in range" errors stays almost constant (its source was already discovered in Each Choice) even though it is being multiplied by all the other choices in the other categories, while the other categories rise or stay constant, as they have not been multiplied by other choices the way "in range" has.
    If we had taken the Cartesian product for those faulty choices, the number of errors in the All Choices method would have risen to around 84, which would change the shape of the previous chart considerably.

    LESSONS LEARNED AND OPEN ISSUES

    The lesson I learned from this project is that errors usually do not happen where the main focus is (here, rotation in the normal workspace), but at the boundaries (division by 0, malfunction at boundary values of sensitive variables) or when the input data is not what was expected (an invalid type, or the wrong format). This can result either in large deviations in the results or even in termination of the whole program, which in big control programs can even lead to loss of lives. In our case the program cannot read +5; it has to be written as +0005 !!! This is not mentioned anywhere in the help or while running the program, and it can result in the program reading the following input as, for example, 3049 !!! I am still not sure whether avoiding the Cartesian product of the faulty categories was the right thing to do. I saw this on page 37 of the INF4290 BB Testing slides and proceeded accordingly. I can see that it saved me a lot of time and effort, but it changed the result of my report, which, if not taken into consideration, can mislead the reader. So at least for me this is still an open issue: are these kinds of shortcuts acceptable in testing, or should I have been more accurate ?!
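
    The four-digit issue described above can be reproduced in a few lines of ordinary C++. The sketch below contrasts a fixed-width read, which behaves in the same spirit as the observed controller, with a more robust parse; it is my own reconstruction of the symptom, not the controller's actual code:

        #include <iostream>
        #include <string>

        int main() {
            std::string line = "a+5";   // what the user types instead of "a+0005"

            // Fixed-width reading, similar in spirit to the observed behaviour:
            // always take exactly 4 characters after 'a', even when fewer were typed,
            // so whatever follows (or garbage) gets mixed into the number.
            std::string fixedField = (line + "????").substr(1, 4);   // "+5??"

            // More robust alternative: parse up to the first non-numeric character.
            long value = std::stol(line.substr(1));                  // 5, as the user intended

            std::cout << "fixed-width field: " << fixedField << "\n"
                      << "robust parse     : " << value << "\n";
            return 0;
        }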


    REFERENCES

