Test System and Methodology
Our Enterprise Test Bench is designed specifically to target long-term performance with a high level of granularity. Many testing methods record only peak and average measurements during the test period. These averages give a basic understanding of performance, but fall short of providing the clearest possible view of I/O QoS (Quality of Service).
'Average' results do little to indicate the performance variability experienced during actual deployment. The degree of variability is especially pertinent, as many applications can hang or lag while they wait for I/O requests to complete. Our methodology captures that variability across the entire measurement window while still reporting average results.
All storage solutions deliver variable performance while under load. The fluctuation itself is normal; the degree of variability is what separates enterprise storage solutions from typical client-side hardware. Recording ongoing measurements from our workloads at one-second intervals illustrates product differentiation in terms of I/O QoS. Scatter charts then give readers a quick view of the I/O latency distribution without having to pore over numerous graphs.
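As a rough illustration of how one-second samples feed a scatter chart, the Python sketch below plots per-interval latency. The results.csv file, its column names, and the use of matplotlib are illustrative assumptions, not the review's actual tooling.

```python
# Minimal sketch: plotting one-second latency samples as a scatter chart.
# Assumes a hypothetical results.csv with "elapsed_seconds" and
# "avg_latency_ms" columns exported by the workload generator.
import csv

import matplotlib.pyplot as plt

seconds, latencies = [], []
with open("results.csv", newline="") as f:
    for row in csv.DictReader(f):
        seconds.append(float(row["elapsed_seconds"]))
        latencies.append(float(row["avg_latency_ms"]))

# Each point is one reporting interval, so clusters and outliers
# remain visible instead of being folded into a single average.
plt.scatter(seconds, latencies, s=4, alpha=0.5)
plt.xlabel("Elapsed time (s)")
plt.ylabel("Average latency (ms)")
plt.title("Per-second latency during steady-state workload")
plt.show()
```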
Consistent latency is the goal of every storage solution, but measurements such as Maximum Latency illuminate only the single longest I/O recorded during testing. This can be misleading, as a single outlying I/O can skew the view of an otherwise superb solution. Standard Deviation accounts for latency distribution, but does not always offer enough granularity to paint a clear picture of system performance. We therefore use histograms that capture the latency of every I/O issued during our test runs.
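The sketch below shows, with synthetic numbers rather than measured data, why a per-I/O histogram reveals more than maximum latency or standard deviation alone: a handful of slow outliers dominate the maximum and inflate the deviation, while the histogram shows where the I/Os actually land.

```python
# Minimal sketch: histogram vs. max latency / standard deviation.
# The per-I/O latencies (in ms) are synthetic illustrative data.
import numpy as np

rng = np.random.default_rng(0)
latencies = np.concatenate([
    rng.normal(0.8, 0.1, 99_990),   # the bulk of I/Os complete quickly
    rng.normal(45.0, 5.0, 10),      # a handful of slow outliers
])

print(f"Max latency: {latencies.max():.1f} ms")      # dominated by one outlier
print(f"Std deviation: {latencies.std():.2f} ms")    # blends the two populations

# Bucketing every I/O exposes the actual distribution.
counts, edges = np.histogram(latencies, bins=[0, 1, 2, 5, 10, 25, 50, 100])
for count, lo, hi in zip(counts, edges[:-1], edges[1:]):
    print(f"{lo:>5.0f}-{hi:<5.0f} ms: {count}")
```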
We also measure power consumption during test runs, with results logged every second to show how power draw behaves under typical workloads. Over the life of a device, power consumption can cost more than the initial acquisition price of the hardware itself, which significantly affects the TCO (total cost of ownership) of the storage solution. We also present IOPS-to-Watts measurements to highlight each solution's efficiency.
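A minimal sketch of the IOPS-to-Watts metric follows: average IOPS over a measurement window divided by average power draw over the same window. The function name and sample numbers are illustrative assumptions, not measured values from this review.

```python
# Minimal sketch of an IOPS-to-Watts efficiency calculation,
# assuming matched per-second IOPS and power samples.

def iops_per_watt(iops_samples: list[float], watt_samples: list[float]) -> float:
    """Efficiency expressed as I/O operations delivered per watt consumed."""
    avg_iops = sum(iops_samples) / len(iops_samples)
    avg_watts = sum(watt_samples) / len(watt_samples)
    return avg_iops / avg_watts

# Illustrative samples only: roughly 418 IOPS at 8 W is ~52 IOPS per watt.
print(iops_per_watt([410, 425, 418], [7.9, 8.1, 8.0]))
```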
We will be utilizing tests specifically tailored to SSHDs, and we will conduct the remainder of our tests over the full LBA range to allow each HDD to highlight its average performance. Both Seagate HDDs are 600GB, while the Toshiba MK4001GRRB is only 147GB; this should be taken into consideration when viewing the test results. The first page of standard test results provides the key to understanding our testing methodology.
Update: Unless otherwise noted, performance tests for the Turbo SSHD are conducted over 33% of the LBA range to highlight caching performance.
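For readers who want the arithmetic behind a restricted-span test, the sketch below computes the LBA range corresponding to 33% of a drive's capacity. The 600GB capacity and 512-byte sector size are illustrative assumptions, not the review's exact test parameters.

```python
# Minimal sketch: computing the LBA span for a 33% test range,
# assuming a 600GB (decimal) drive with 512-byte sectors.
SECTOR_BYTES = 512
capacity_bytes = 600 * 1000**3
total_lbas = capacity_bytes // SECTOR_BYTES

test_fraction = 0.33
last_test_lba = int(total_lbas * test_fraction) - 1

print(f"Total LBAs: {total_lbas:,}")
print(f"Test range: LBA 0 through {last_test_lba:,} "
      f"({test_fraction:.0%} of the device)")
```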