Viewing and Validating Test Cases
In the View Test Cases window, you can view and validate the test cases that Jtest used for Dynamic Analysis.
To open this window from the Class Testing UI, click the View Test Cases button.
To open this window from the Project Testing UI's Results panel, right-click the [Class Name] node, then choose View Test Cases from the shortcut menu.
About the View Test Cases Window
The View Test Cases window contains the following nodes:
Test cases for [classname]
Contains the test cases that Jtest generated and executed in this class's most recent test.
Automatic Test Cases
Contains the test cases that Jtest generated automatically. Only the test cases that do something new (for example, increase coverage or throw a new exception) are shown.
[method name]
Contains test cases for this method.
Test Case
Contains all of the information for a test case.
Test Case Input
Contains input that defines the test case.
The input for automatic test cases is the calling sequence.
The input for user-defined test cases is the input for each argument.
If stubs were used, they will be listed here. Empty boxes indicate automatically generated stubs. Black boxes indicate user-defined stubs. For more information on stubs, see Testing Classes That Reference External Resources and Using Custom Stubs.
Outcomes
Contains outcomes for this test case. Verify whether each outcome is correct or incorrect according to the class specification, and set its state using the shortcut menus.
When the outcome is an object, Jtest automatically chooses the toString method to show its state.
If a method named jtestInspector is defined for the object's class, Jtest will use only the return value of this method to show the object's state.
If neither a toString nor a jtestInspector method is defined, Jtest will heuristically choose some public instance methods of that object to show its state.
If the method under test is a static method, Jtest will heuristically choose public static methods to show the class state. If the methods Jtest chooses are not sufficient, declare a static method named sjtestInspector for the class; Jtest will then use the return value of this method to show the class state.
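For example, a class under test might expose its state as follows. This is a minimal illustrative sketch: the class and its fields are invented, and it assumes that jtestInspector is a public, no-argument instance method returning a String, which is not specified above.

```java
// Purely illustrative class; the field names and the assumed signature of
// jtestInspector (public, no arguments, returns String) are this sketch's
// assumptions, not requirements stated in the documentation above.
public class Account {
    private final String owner;
    private long balanceCents;

    public Account(String owner, long balanceCents) {
        this.owner = owner;
        this.balanceCents = balanceCents;
    }

    public void deposit(long cents) {
        balanceCents += cents;
    }

    // Used by default to display this object's state in the Outcomes node.
    @Override
    public String toString() {
        return owner + ": " + balanceCents + " cents";
    }

    // If present, Jtest shows only this method's return value as the
    // object's state, instead of the toString-based display.
    public String jtestInspector() {
        return "owner=" + owner + ", balanceCents=" + balanceCents;
    }
}
```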
[n] = the number of outcomes for this test case.
Exception
Indicates whether an exception occurred, and, if so, what type of exception occurred.
If an exception was suppressed, you can see the reason for the suppression by right-clicking the exception message node and choosing Why Suppressed? from the shortcut menu.
User Defined Test Cases
Contains test cases generated from user-defined input.
Method Inputs
Contains test cases generated from method inputs.
[method name]
Contains test cases for this method.
Test Case
Contains all of the information for a test case.
Test Case Input
Contains input that defines the test case.
The input for automatic test cases is the calling sequence.
The input for user-defined test cases is the input for each argument.
If stubs were used, they will be listed here. Empty boxes indicate automatically generated stubs. Black boxes indicate user-defined stubs. To see the stack trace where a stub invocation occurred, expand the stub's branch. For more information on stubs, see Testing Classes That Reference External Resources and Using Custom Stubs.
Outcomes
Contains outcomes for this test case. Verify whether each outcome is correct or incorrect according to the class specification, and set its state using the shortcut menus.
When the outcome is an object, Jtest automatically chooses the toString method to show its state.
If a method named jtestInspector is defined for the object's class, Jtest will use only the return value of this method to show the object's state.
If neither a toString nor a jtestInspector method is defined, Jtest will heuristically choose some public instance methods of that object to show its state.
If the method under test is a static method, Jtest will heuristically choose public static methods to show the class state. If the methods Jtest chooses are not sufficient, declare a static method named sjtestInspector for the class; Jtest will then use the return value of this method to show the class state.
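Similarly, a class whose static methods are under test can expose its class state. Again a minimal illustrative sketch, assuming sjtestInspector is a public, no-argument static method returning a String; the class itself is invented for this example.

```java
// Purely illustrative class with static state; the assumed signature of
// sjtestInspector (public, static, no arguments, returns String) is this
// sketch's assumption, not a requirement stated in the documentation above.
public class RequestCounter {
    private static int handled = 0;

    public static void handleRequest() {
        handled++;
    }

    // If the public static methods Jtest picks heuristically do not reveal
    // enough of the class state, this method's return value is used instead.
    public static String sjtestInspector() {
        return "handled=" + handled;
    }
}
```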
[n] = the number of outcomes for this test case.
Exception
Indicates whether an exception occurred, and, if so, what type of exception occurred. When an exception occurs, stack trace information can be displayed by opening this node.
If an exception was suppressed, you can see the reason for the suppression by right-clicking the exception message node and choosing Why Suppressed? from the shortcut menu.
Test Classes
Contains the number of test cases added from test classes.
If you change specification and regression test cases and want to restore the set used during the actual tests, right-click the Specification and Regression Test Cases node, then choose the Reload option from the shortcut menu. Jtest will then reload the original test cases.
The color of the arrow to the left of a leaf has the following meaning:
- green: The outcome is correct, or has been validated as correct by the user.
- red: The outcome is incorrect (or has been validated as incorrect by the user), or an uncaught runtime exception was detected.
- gray: The outcome status is unknown, and no uncaught runtime exceptions were detected.
- no arrow: The user has specified to ignore this outcome.
If the Perform Automatic Regression Testing flag is selected, Jtest will assume that gray outcomes are correct and will report an error if the outcome changes.
In this window, an outcome is marked as incorrect if it differs from the corresponding outcome in the Specification and Regression Test Cases branch of the Errors Found panel (in the Class Testing UI) or Results panel (in the Project Testing UI). When more than one test case outcome differs, only one of them is marked as an error, and only that one is reported in the Errors Found panel or Results panel.
Validating Outcomes
Indicate whether or not the outcome for each test case is correct by right-clicking the appropriate outcome node, then choosing one of the following commands from the shortcut menu:
- Mark as Correct: the listed outcome is the expected outcome.
- Mark as Incorrect: the listed outcome is not the expected outcome.
- Mark as Unknown: you don't know how the listed outcome compares to the expected outcome.
- Mark as Ignore: you want Jtest to ignore the listed outcome.
To ignore an entire test, right-click the appropriate Test Case node, and choose Ignore this Test Case from the shortcut menu. To stop ignoring a test case that you previously ignored, right-click the appropriate Test Case node, and choose Do Not Ignore this Test Case from the shortcut menu.
To evaluate all of a test case's outcomes with one click, right-click the appropriate Outcomes leaf, then choose the appropriate Set All to... command from the shortcut menu.
To remove an entire test case, right-click the appropriate Test Case node, then choose Delete from the shortcut menu.
To remove the entire set of test cases, right-click the Specification and Regression Test Cases node, then choose Delete All from the shortcut menu.
To indicate the correct outcome for a test case:
- Open the Class Test Parameters window.
- Open that test case's branch in Dynamic Analysis> Test Case Evaluation> Specification and Regression Testing.
- Right-click the outcome, choose Edit from the shortcut menu, then enter the correct value in the text field that opens.