Exploring and Customizing Project Test Results


The Project Testing UI provides a variety of ways to explore test results, as well as to customize what results are reported the next time the test is run. Most options are accessed via shortcut menus in the Results panel. Some actions that you might want to perform when viewing results include:

  • View/evaluate test cases: To view and/or evaluate the automatically generated and user-defined test cases used to test this class, right-click either the [Class Name] node or the [Class Name]> Test Progress> Dynamic Analysis> Number of Test Cases Executed node, then choose View Test Cases from the shortcut menu.
  • View the source responsible for a rule violation or exception: To view the source of the error, with the problematic line highlighted, double-click the file/line information for any error contained within the Errors Found branch.
  • View a description of a violated rule: To view a description of a violated static analysis rule, along with an example violation and a suggested repair, right-click a static analysis error message with a wizard or bug icon, then choose View Rule Description from the shortcut menu.
  • View the stack trace of an uncaught runtime exception: To view a stack trace like the one that the Java virtual machine would give if a reported uncaught runtime exception were thrown, open the uncaught runtime exception's branch. (A sketch of code that produces such an exception, and the resulting stack trace, appears after this list.)
  • View the calling sequence of an uncaught runtime exception: To view an example usage of the class that leads to the reported uncaught runtime exception, open the uncaught runtime exception's Test Case Input branch.
  • View error-causing input: To view the error-causing input, open any specification or regression testing error or uncaught runtime exception error's Test Case Input branch.
  • View an example test case: To view an example Java program that executes the input for a test case, right-click the Test Case Input node, then choose View Example Test Case from the shortcut menu. (A hypothetical sketch of such a program appears after this list.)

    If an exception has been reported for this input, the exception will be thrown when you run this program.

    Sometimes (for example, while testing an abstract class) the input that Jtest finds doesn't correspond to a compilable Java program.

    If the input includes Stubs, the generated .java program will include only the stub text.
  • View a report: To view a report file, click the Report button in the Project Testing UI toolbar.
  • View metrics: To view project and average class metrics, click Metrics. To view a specific class's metrics, right-click that class's node in the Results panel, then choose View Class Metrics.
  • Gauge coverage: To review coverage, open the [Class Name]> Test Progress> Dynamic Analysis> Total Coverage node. A method is designated "covered" if Jtest automatically tests any part of that method. Jtest also reports branch coverage. For example, if a piece of code contains an "if" statement, that statement has two branches; if only one branch is covered (say, the true path), Jtest reports 50% branch coverage for that statement. (See the branch-coverage sketch after this list.)
  • Modify test case evaluation: If a reported dynamic analysis error is actually the correct behavior of the class, or if you want Jtest to ignore the outcome of an input while checking for specification and regression errors, right-click the error message with the bug icon, then choose the desired command from the shortcut menu.
    • To indicate that a reported error is not an error, choose Not an Error.
    • To tell Jtest to ignore the outcome for this input, choose Ignore this Outcome. If you choose this option, the outcome will not be used for comparisons when searching for specification or regression errors. Also, no uncaught runtime exceptions will be reported for this test case input.
      To have Jtest ignore all outcomes in a class's Uncaught Runtime Exceptions or Specification and Regression Errors node, or to indicate that all errors contained in a class's Uncaught Runtime Exceptions or Specification and Regression Errors node are not actual errors, right-click the appropriate node and choose Set All to: Not an Error or Set All to: Ignore this Outcome.
  • Suppress messages: To suppress the reporting of a single, specific exception or static analysis violation, right-click the error message that you do not want reported in future test runs, then choose Suppress from the shortcut menu. This automatically adds the suppression to the appropriate Suppressions Table (for dynamic analysis messages) or Suppressions List (for static analysis messages).
  • Remove a class's results from the Results panel and Results folder: To remove the results of a class from both the Results panel and the Results folder (which stores project test results), right-click the [Class Name] node, then choose Delete from the shortcut menu.
  • Edit Class Test Parameters: To modify a specific class's Class Test Parameters, right-click the [Class Name] node, then choose Edit Class Test Parameters from the shortcut menu.
  • Load in Class Testing UI: To focus on the errors for a single class, view the class in the Class Testing UI by right-clicking the [Class Name] node, then choosing Load in Class Testing UI from the shortcut menu.
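The following sketch illustrates the kind of code and input behind an uncaught runtime exception report. The Account class, its method, and the triggering input are hypothetical examples, not output taken from Jtest itself.

    // Hypothetical class under test (not generated by Jtest).
    public class Account {
        private String owner;

        public Account(String owner) {
            this.owner = owner;
        }

        // Calling this method on an Account constructed with a null owner
        // throws an uncaught NullPointerException. A calling sequence such
        // as new Account(null).getOwnerUpperCase() is the kind of test case
        // input a tool could report for this exception.
        public String getOwnerUpperCase() {
            return owner.toUpperCase();
        }
    }

Executing that calling sequence produces a JVM stack trace along the lines of:

    java.lang.NullPointerException
            at Account.getOwnerUpperCase(Account.java:14)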
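Similarly, here is a hypothetical sketch of the general shape of an example test case program for the input above (assuming the Account class from the previous sketch); it is not Jtest's actual generated code.

    // Hypothetical example test case that executes the input
    // new Account(null).getOwnerUpperCase().
    public class AccountExampleTestCase {
        public static void main(String[] args) {
            Account account = new Account(null);
            // If an uncaught runtime exception was reported for this input,
            // it is thrown here when the program is run.
            String result = account.getOwnerUpperCase();
            System.out.println("result = " + result);
        }
    }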
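Finally, to make the branch coverage arithmetic concrete, consider this small hypothetical method:

    public class Limits {
        // The "if" statement below has two branches: the true path
        // (value > max) and the false path. A test run whose inputs only
        // exercise value > max covers one of the two branches, so branch
        // coverage for this statement is reported as 50%.
        public static int clamp(int value, int max) {
            if (value > max) {
                return max;    // true branch
            }
            return value;      // false branch
        }
    }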
