Hardware compatibility test cases usually differ somewhat from the manual functional test cases we often employ. Since compatibility testing focuses mainly on interactions between hardware and software, the test cases tend to be less granular at the functional level and more geared toward measuring variables, such as frame rate, in the application under test.
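As a rough illustration of what "measuring variables" means in practice, the sketch below reduces a captured set of per-frame render times to the kind of headline frame-rate metrics a compatibility pass might record. The function name and the choice of average and 1% low FPS are our assumptions for this example, not a prescribed standard.

```python
# Hypothetical sketch: summarizing captured frame times into frame-rate metrics.
from statistics import mean

def summarize_frame_times(frame_times_ms):
    """Reduce per-frame render times (milliseconds) to headline FPS metrics."""
    fps = [1000.0 / t for t in frame_times_ms if t > 0]
    worst_1pct = sorted(fps)[: max(1, len(fps) // 100)]
    return {
        "avg_fps": round(mean(fps), 1),
        "low_1pct_fps": round(mean(worst_1pct), 1),
    }

# Example: a short capture from one test scene
print(summarize_frame_times([16.7, 16.9, 17.1, 33.4, 16.8]))
```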
Compatibility testing efforts are typically built around a predefined hardware configuration matrix that defines the specific configurations to be tested and provides a place to record and maintain data and test results. There are usually very few "click this, type that, go to the next screen" tests; instead, testing is often spread across multiple modules or areas of an application, especially when looking at performance metrics that can vary with things like scene geometry and the application of lighting and effects.
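To make the idea of a configuration matrix concrete, here is a minimal sketch of how such a matrix could be represented, with a results slot per application area so metrics can be recorded against each configuration. The field names, hardware entries, and numbers are purely illustrative assumptions, not a schema or data from an actual project.

```python
# Illustrative sketch only: one way to represent a hardware configuration matrix.
from dataclasses import dataclass, field

@dataclass
class TestConfiguration:
    cpu: str
    gpu: str
    driver: str
    ram_gb: int
    resolution: str
    results: dict = field(default_factory=dict)  # application area -> measured metrics

matrix = [
    TestConfiguration("Core i5", "GeForce GTX 660", "331.82", 8, "1920x1080"),
    TestConfiguration("Core i7", "Radeon HD 7870", "13.12", 16, "1920x1080"),
]

# Results are recorded per module/area, since performance can vary with
# scene geometry, lighting, and effects (values here are placeholders).
matrix[0].results["outdoor_scene"] = {"avg_fps": 42.3, "low_1pct_fps": 28.1}
```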
Here is an example of the type of hardware matrix and results that we typically produce when looking mainly at configuration-specific performance. This was for a DirectX 11 game, with an eye towards establishing playable minimum configurations, hence the large number of blank results cells. The matrix contains a lot of data; scroll to the right for the metrics being collected and the results. The snazzy cover sheet summarizing the results was added later, and isn’t part of our typical data collection.
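For the minimum-configuration question specifically, a hedged sketch of how a "playable" verdict might be derived from the collected metrics is shown below: every tested area on a configuration must meet a frame-rate floor. The 30 FPS threshold and the sample numbers are assumptions for illustration, not the project's actual acceptance criteria.

```python
# Assumed example threshold; real projects set their own acceptance bar.
MIN_PLAYABLE_FPS = 30.0

def is_playable(results_by_area):
    """True only if every tested area met the frame-rate floor."""
    return bool(results_by_area) and all(
        metrics["avg_fps"] >= MIN_PLAYABLE_FPS for metrics in results_by_area.values()
    )

# Illustrative results for one configuration, keyed by application area.
sample_results = {
    "outdoor_scene": {"avg_fps": 42.3},
    "heavy_effects": {"avg_fps": 27.8},
}
print("playable" if is_playable(sample_results) else "below minimum")
```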
In some cases, we are provided with existing test cases that we use, especially when testing is intended to normalize the test path, such as with a productivity application. Here is an example of our lab hardware and test results being integrated into an existing set of test cases. This project was a small WebGL tie-in to a much larger OpenGL desktop application, hence the inclusion of the web browser as part of the configuration information.
Feel free to contact us if you have any questions or are interested in leveraging our software quality assurance services for your next project.