Exploring and Customizing Class Test Results
The Class Testing UI provides you with a variety of ways to explore test results and to customize what results are reported the next time the test is run. Some actions that you might want to perform when viewing results include:
- View/evaluate test cases: To view and/or evaluate the automatically-generated and user-defined test cases used to test this class, click the View tool bar button.
- View the source responsible for a rule violation or exception: To view the source of the error, with the problematic line highlighted, double-click the file/line information for the error in the Errors Found panel.
- Edit your class: To open your class in a text editor, right-click the Source button, then choose Edit Source from the shortcut menu.
- View a description of a violated rule: To view a description of a violated static analysis rule, along with an example of that rule and a suggested repair, right-click a static analysis error message that has a wizard hat or bug icon, then choose View Rule Description from the shortcut menu.
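A rule description of this kind typically pairs a violation with a suggested repair. As an illustration only (the rule and class names below are hypothetical, not taken from Jtest's rule set), a common static analysis rule flags `String` comparison with `==`:

```java
// Illustrative violation/repair pairing like a rule description might show.
public class RuleExample {
    // Violation: '==' compares object identity, not string contents.
    static boolean badCompare(String a, String b) {
        return a == b;
    }

    // Suggested repair: use equals() to compare contents.
    static boolean goodCompare(String a, String b) {
        return a != null && a.equals(b);
    }

    public static void main(String[] args) {
        String x = new String("jtest");
        String y = new String("jtest");
        System.out.println(badCompare(x, y));   // false: distinct objects
        System.out.println(goodCompare(x, y));  // true: same contents
    }
}
```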
- View the stack trace of an uncaught runtime exception: To view a stack trace like the one the Java virtual machine would produce if a reported uncaught runtime exception were thrown, open the branch containing the uncaught runtime exception.
- View the calling sequence of an uncaught runtime exception: To view an example usage of the class that leads to the reported uncaught runtime exception, open any uncaught runtime exception's Test Case Input branch.
- View the error-causing input: To view the error-causing input, open the Test Case Input branch of any specification/regression testing error or uncaught runtime exception.
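To make the last three items concrete, here is a sketch (with a hypothetical class, not one from the product documentation) of a method that can throw an uncaught runtime exception, together with the kind of calling sequence and error-causing input that a Test Case Input branch would record:

```java
// Hypothetical class under test: ownerNameLength() throws a
// NullPointerException when the error-causing input 'owner' is null.
public class Account {
    private final String owner;

    public Account(String owner) { this.owner = owner; }

    public int ownerNameLength() {
        return owner.length(); // NPE here when owner == null
    }

    public static void main(String[] args) {
        // Calling sequence comparable to a Test Case Input branch:
        //   Account a = new Account(null); a.ownerNameLength();
        try {
            new Account(null).ownerNameLength();
        } catch (NullPointerException e) {
            // The stack trace names the failing method, as the JVM would.
            System.out.println("NullPointerException in ownerNameLength");
        }
    }
}
```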
- View an example test case: To view an example Java program that executes the input for a test case, right-click the Test Case Input node, then choose View Example Test Case from the shortcut menu.
If an exception has been reported for this input, the exception will be thrown when you run this program. Sometimes (for example, while testing an abstract class) the input that Jtest finds doesn't correspond to a compilable Java program. If the input includes stubs, the generated .java program will include only the stub text.
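The generated example program is, roughly, a standalone `main` that constructs the class under test and replays the recorded input. The following sketch shows what such a program might resemble; the `Stack` class and its methods are hypothetical stand-ins, not output copied from Jtest:

```java
// Sketch of an example test case program: it replays the Test Case
// Input against a (hypothetical) class under test. Running it with an
// error-causing input would reproduce the reported exception.
public class ExampleTestCase {
    public static void main(String[] args) {
        // Test Case Input: push(-1), then pop()
        Stack s = new Stack();
        s.push(-1);
        System.out.println(s.pop()); // prints -1
        // A second pop() here would throw ArrayIndexOutOfBoundsException,
        // reproducing an uncaught runtime exception for that input.
    }
}

// Minimal stand-in for the class under test.
class Stack {
    private final int[] data = new int[10];
    private int top = 0;
    void push(int v) { data[top++] = v; }
    int pop() { return data[--top]; }
}
```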
- View a report: To view a report file, click Report.
- View metrics: To view class metrics, click Metrics.
- Gauge coverage: There are two ways to gauge test coverage:
- Review the coverage data displayed in the Test Progress panel.
- Display and review the report file.
- The report file contains, among other information, the annotated source code for the tested class. This may be used to determine what lines Jtest tested and what lines it did not test.
A method is designated "covered" if Jtest automatically tests any part of that method. Jtest also reports branch coverage. For example, an if statement creates two branches; if only one of them is exercised (say, the true path), Jtest reports 50% branch coverage for that decision.
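The branch-coverage arithmetic can be illustrated with a small sketch (the class and method names here are made up for illustration):

```java
// Sketch of branch coverage: the 'if' below has two branches. A test
// run that only exercises classify(5) covers the true branch, i.e.
// 50% branch coverage for this decision; adding classify(-5) covers
// the false branch as well, raising it to 100%.
public class CoverageExample {
    static String classify(int n) {
        if (n >= 0) {
            return "non-negative"; // true branch
        } else {
            return "negative";     // false branch
        }
    }

    public static void main(String[] args) {
        System.out.println(classify(5));   // exercises the true branch
        System.out.println(classify(-5));  // exercises the false branch
    }
}
```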
- Modify test case evaluation: If a reported uncaught runtime exception is actually the correct behavior of the class, or if you want Jtest to ignore the outcome of an input while checking for specification and regression errors, right-click the error message with the bug icon, then choose the appropriate command from the shortcut menu.
- To indicate that a reported error is not an error, choose Not an Error.
- To tell Jtest to ignore the outcome for this input, choose Ignore this Outcome. If you choose this option, the outcome will not be used for comparisons when searching for specification or regression errors. Also, no uncaught runtime exceptions will be reported for this test case input.
- To indicate that all errors contained in a class's Uncaught Runtime Exceptions or Specification and Regression Errors node are not actual errors, or to have Jtest ignore all outcomes in that node, right-click the appropriate node, then choose Set All to: Not an Error or Set All to: Ignore this Outcome.
- Suppress messages: To suppress the reporting of a single, specific exception or static analysis violation, right-click the message (with the bug icon) related to the error/violation that you do not want reported in future test runs, then choose Suppress from the shortcut menu. This automatically adds the suppression to the appropriate Suppressions Table (for dynamic analysis messages) or Suppressions List (for static analysis messages).