<h2>Testing Scope </h2>
<h3> 1. What's in scope </h3>
The native jimage calls (JIMAGE_*) defined in the HotSpot header jimage.hpp.
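For reference, the entry points under test are declared in jimage.hpp roughly as sketched below. The declarations are paraphrased from the libjimage sources; parameter names, the exact order of the JIMAGE_FindResource arguments, and the visitor signature should be verified against the current header.

```cpp
// Paraphrased sketch of the jimage.hpp API under test (verify against the real header).
// jint/jlong come from jni.h, which jimage.hpp is assumed to include.
class JImageFile;                 // opaque handle returned by JIMAGE_Open
typedef jlong JImageLocationRef;  // opaque reference to a resource location

extern "C" JImageFile*       JIMAGE_Open(const char* name, jint* error);
extern "C" void              JIMAGE_Close(JImageFile* jimage);
extern "C" const char*       JIMAGE_PackageToModule(JImageFile* jimage, const char* package_name);
extern "C" JImageLocationRef JIMAGE_FindResource(JImageFile* jimage, const char* module_name,
                                                 const char* version, const char* name, jlong* size);
extern "C" jlong             JIMAGE_GetResource(JImageFile* jimage, JImageLocationRef location,
                                                char* buffer, jlong size);

// Visitor callback used by JIMAGE_ResourceIterator; returning true continues the iteration.
typedef bool (*JImageResourceVisitor_t)(JImageFile* jimage, const char* module_name,
                                        const char* version, const char* package,
                                        const char* name, const char* extension, void* arg);
extern "C" bool              JIMAGE_ResourceIterator(JImageFile* jimage,
                                                     JImageResourceVisitor_t visitor, void* arg);
```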
<h3>2. What's out of scope </h3>
<h2>Tests Inventory </h2>
<h3>1. Existing tests </h3>
* *jdk/test/jdk/internal/jimage/JImageReadTest.java* - Unit test for libjimage JIMAGE_Open/Read/Close
<h3>2. New tests</h3>
* Various positive test scenarios checking basic functionality
* Various negative test scenarios with incorrect/corrupted/non-existent data (both the jimage file and the input parameters)
See the *Feature Cases, Tests and Test Cases* section.
<h3>3. Deprecated tests</h3>
* Tests for the previous implementation (see https://bugs.openjdk.java.net/browse/JDK-8077725) - **already removed**.
<h2>Feature Cases, Tests and Test Cases</h2>
**TBD** (a minimal native sketch exercising these calls follows the list below)
* JIMAGE_Open
- positive: multiple open of the same jimage file
- negative: open non-existing jimage file
- negative: open empty jimage file
- negative: open corrupted (incorrect magic) jimage file
* JIMAGE_Close
- positive: multiple closes with one image handle
- negative: close with wrong handle
* JIMAGE_PackageToModule
- positive: non-existing package
- negative: null package value
- negative: incorrect/null handle value
* JIMAGE_FindResource
- negative: corrupted/null handle
- negative: wrong/null module name
- negative: wrong/null class/resource name
- negative: incorrect/null version
- negative: null instead of size array
* JIMAGE_GetResource
- negative: corrupted/null handle
- negative: corrupted/null location ref
- negative: insufficient size of data buffer (zero bytes, non-zero less than resource size, null)
- negative: wrong size value (negative, zero, positive but less than real size)
* JIMAGE_ResourceIterator
- negative: call JIMAGE_Resources with corrupted/null handle
- negative: call JIMAGE_Resources with null array for names
- **NOTE: the ImageNativeSubstrate.java class does not provide further access to the iterator**
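To make the intent of these cases concrete, here is a minimal native sketch of the basic positive flow plus two of the negative checks. It is an illustration only: the image path, the version string "9.0", and the assumed return conventions (NULL from a failed JIMAGE_Open, 0 from a failed JIMAGE_FindResource) need to be confirmed against jimage.hpp, and real tests would report diagnostics rather than rely on assert.

```cpp
#include <cassert>
#include <cstdio>
#include <vector>
#include "jimage.hpp"  // assumed to declare the JIMAGE_* calls sketched above

// Visitor for JIMAGE_ResourceIterator: counts visited resources, returns true to keep iterating.
static bool count_visitor(JImageFile* jimage, const char* module_name, const char* version,
                          const char* package, const char* name, const char* extension, void* arg) {
    ++*static_cast<int*>(arg);
    return true;
}

int main() {
    jint error = 0;

    // Negative: opening a non-existent jimage file is expected to fail.
    assert(JIMAGE_Open("/no/such/modules", &error) == nullptr);

    // Positive: open the runtime image (placeholder path).
    JImageFile* image = JIMAGE_Open("/path/to/jdk/lib/modules", &error);
    assert(image != nullptr);

    // Positive: a well-known package maps to its module (expected "java.base").
    const char* module = JIMAGE_PackageToModule(image, "java/lang");
    assert(module != nullptr);

    // Positive: find java/lang/Object.class and read its bytes back in full.
    jlong size = 0;
    JImageLocationRef location =
        JIMAGE_FindResource(image, "java.base", "9.0", "java/lang/Object.class", &size);
    assert(location != 0 && size > 0);

    std::vector<char> buffer(static_cast<size_t>(size));
    assert(JIMAGE_GetResource(image, location, buffer.data(), size) == size);

    // Negative: a lookup with a bogus module name must not find the resource.
    jlong bogus_size = 0;
    assert(JIMAGE_FindResource(image, "no.such.module", "9.0",
                               "java/lang/Object.class", &bogus_size) == 0);

    // Iterate over all resources, counting them via the visitor callback.
    int count = 0;
    JIMAGE_ResourceIterator(image, count_visitor, &count);
    printf("resources visited: %d\n", count);

    JIMAGE_Close(image);
    return 0;
}
```

The corrupted/null-handle and corrupted-location cases above would follow the same pattern with deliberately invalid handles and location refs, checking that the library fails cleanly rather than crashes.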
<h2>Success Criteria</h2>
*The criteria below are used to decide whether the testing was successful. The "Result" column will be filled in at the end of test development for future reference.*
|Criteria|Planned|Status|Result|Comments|
|-|-|-|-|-|
| # of new tests | | | | |
| Public API coverage | *approx 100%* | | | |
| Block coverage of new code | *approx 100%* | | | *native or Java* |
| Test stability | 100 runs with no failures | | | *ensures there are no intermittent failures* |
| Open test issues | < 2 | | | *ensures all the tests are ready* |
| Pass rate | 95% | | | *ensures the code is in good shape* |
<h3>Dependencies</h3>
No dependencies.
<h3>Risks</h3>
No risks.