Test System and Methodology
Our approach to testing storage is designed to measure long-term performance with a high level of granularity. Many testing methods record only peak and average measurements over the test period. These averages give a basic sense of performance but fall short of providing the clearest possible view of I/O Quality of Service (QoS).
'Average' results do little to indicate the performance variability experienced during actual deployment. That variability is especially pertinent because many applications can hang or lag while they wait for I/O requests to complete. Our methodology illustrates performance variability alongside average measurements taken during the measurement window.
All storage solutions deliver variable levels of performance while under load. Some fluctuation is normal, but the degree of variability is what separates enterprise storage solutions from typical client-side hardware. Recording ongoing measurements from our workloads at one-second reporting intervals illustrates product differentiation in terms of I/O QoS. Scatter charts then give readers a clear view of I/O latency distribution without requiring them to pore over numerous graphs.
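To make the one-second reporting idea concrete, here is a minimal sketch (not our actual tooling, and the data shape is assumed) that reduces per-I/O completion records into per-second intervals, the granularity behind a latency scatter chart:

```python
from collections import defaultdict

def per_second_stats(records):
    """records: list of (completion_time_s, latency_ms) tuples.
    Returns {second: (io_count, avg_latency_ms)} per one-second bucket."""
    buckets = defaultdict(list)
    for t, lat in records:
        buckets[int(t)].append(lat)
    return {sec: (len(lats), sum(lats) / len(lats))
            for sec, lats in sorted(buckets.items())}

# Hypothetical trace: two I/Os complete in second 0, two in second 1,
# one of which is a 12 ms outlier that an overall average would hide.
trace = [(0.1, 0.4), (0.7, 0.6), (1.2, 12.0), (1.9, 0.5)]
print(per_second_stats(trace))  # {0: (2, 0.5), 1: (2, 6.25)}
```

Plotting one point per interval in this fashion is what exposes the latency spikes that a single test-wide average smooths away.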
Consistent latency is the goal of every storage solution, but measurements such as Maximum Latency illuminate only the single longest I/O observed during testing. This can be misleading, as a single outlying I/O can skew the view of an otherwise superb solution. Standard Deviation accounts for latency distribution but does not always illustrate it with enough granularity to give a clear picture of system performance. We therefore use histograms, which capture the latency of every single I/O issued during our test runs.
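A small illustrative example (the latency values are invented for demonstration) shows why a histogram is more informative than Maximum Latency or Standard Deviation alone:

```python
import statistics

def latency_histogram(latencies_ms, bucket_edges_ms):
    """Count I/Os falling into each latency bucket; the final
    bucket collects everything beyond the last edge."""
    counts = [0] * (len(bucket_edges_ms) + 1)
    for lat in latencies_ms:
        for i, edge in enumerate(bucket_edges_ms):
            if lat <= edge:
                counts[i] += 1
                break
        else:
            counts[-1] += 1
    return counts

# 998 fast I/Os, one moderate, one severe outlier.
samples = [0.5] * 998 + [2.0, 50.0]
edges = [1, 5, 10, 20]  # bucket edges in ms

print(max(samples))                      # 50.0 -- Maximum Latency sees only the outlier
print(statistics.pstdev(samples))        # stddev is inflated by that one I/O
print(latency_histogram(samples, edges)) # [998, 1, 0, 0, 1]
```

The histogram makes it immediately visible that 998 of 1,000 I/Os completed in under 1 ms, a fact neither summary statistic conveys.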
We also measure power consumption during test runs, recording results every second to illuminate power consumption behavior over time; power draw significantly affects the total cost of ownership (TCO) of a storage solution. In addition, we present IOPS-to-Watts measurements to highlight each solution's efficiency.
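The IOPS-to-Watts figure is simply the ratio of throughput to power draw at each reporting interval. A brief sketch with hypothetical per-second samples:

```python
# Hypothetical per-second samples of measured IOPS and power draw (watts).
iops_samples = [41000, 40500, 39800, 40200]
watt_samples = [6.8, 6.7, 6.6, 6.7]

# Efficiency at each one-second interval, then averaged over the window.
efficiency = [iops / watts for iops, watts in zip(iops_samples, watt_samples)]
avg_eff = sum(efficiency) / len(efficiency)
print(f"average IOPS per watt: {avg_eff:.0f}")
```

A higher IOPS-per-watt value indicates a drive delivering more work for the same power budget, which feeds directly into the TCO comparison.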
We conduct our tests over the full LBA range so that each HDD's average performance across the entire platter is represented. The Seagate Enterprise Performance v7 is a 1.2TB drive, while the WD Xe and the Toshiba AL13SEB900 are both 900GB drives; these differing capacities should be taken into account when viewing results. The first page of results provides the 'key' to understanding and interpreting our test methodology.