Test System and Methodology
Our approach to storage testing targets long-term performance with a high level of granularity. Many testing methods record only peak and average measurements over the test period. These averages give a basic sense of performance, but fall short of providing a clear view of I/O QoS (Quality of Service).
While under load, all storage solutions deliver variable levels of performance. "Average" results do little to indicate the performance variability experienced during actual deployment. This variability is especially pertinent because many applications can hang or lag while they wait for I/O requests to complete. Some fluctuation is normal, but its magnitude is what separates enterprise storage solutions from typical client-side hardware.
Recording ongoing measurements from our workloads at one-second reporting intervals highlights product differentiation in terms of I/O QoS. Scatter charts give readers a view of the I/O latency distribution at a glance, without the need to pore over numerous separate graphs. This methodology illustrates performance variability while still including average measurements over the measurement window.
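As a rough illustration of this approach, the sketch below buckets per-I/O completion records into one-second reporting intervals, producing the kind of per-second (IOPS, latency) points that feed a scatter chart. The data and field layout here are invented for illustration and do not reflect our actual tooling:

```python
from collections import defaultdict

# Hypothetical (timestamp_s, latency_ms) completion records, as a workload
# generator might log them (all values are made up for illustration).
completions = [
    (0.10, 4.1), (0.55, 3.9), (0.90, 4.3),   # second 0: 3 I/Os
    (1.20, 4.0), (1.70, 55.0),               # second 1: 2 I/Os, one slow
    (2.05, 4.2), (2.40, 4.1), (2.80, 4.0),   # second 2: 3 I/Os
]

# Bucket completions into one-second reporting intervals.
buckets = defaultdict(list)
for ts, lat in completions:
    buckets[int(ts)].append(lat)

# Each interval yields one (IOPS, mean latency) scatter-chart point.
for second in sorted(buckets):
    lats = buckets[second]
    print(f"t={second}s  IOPS={len(lats)}  avg_lat={sum(lats)/len(lats):.1f} ms")
```

Plotting one such point per second over an entire test run reveals the latency spikes that a single whole-run average would hide.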
IOPS data that ignores latency is of little use. Consistent latency is the goal of every storage solution, yet measurements such as Maximum Latency illuminate only the single longest I/O recorded during testing. This can be misleading: one outlying I/O can skew the view of an otherwise superb solution. Standard Deviation accounts for latency spread, but it does not always illustrate the I/O distribution with enough granularity to give a clear picture of system performance. We therefore use high-granularity I/O latency charts to illuminate performance during our test runs.
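To see why a single outlier skews Maximum Latency while the underlying distribution stays healthy, consider this minimal sketch with made-up latency samples (all values are illustrative only):

```python
import statistics

# Hypothetical per-I/O latency samples in milliseconds: a steady drive
# with one outlying I/O (all values invented for illustration).
latencies_ms = [4.0] * 990 + [4.5] * 9 + [250.0]

avg = statistics.mean(latencies_ms)          # barely moved by the outlier
max_lat = max(latencies_ms)                  # dominated by the one outlier
stdev = statistics.stdev(latencies_ms)       # inflated, but hard to interpret
p99 = statistics.quantiles(latencies_ms, n=100)[98]  # 99th percentile

print(f"average  : {avg:.2f} ms")
print(f"maximum  : {max_lat:.1f} ms")
print(f"st. dev. : {stdev:.2f} ms")
print(f"99th pct.: {p99:.2f} ms")
```

Here Maximum Latency reports 250 ms even though 99% of I/Os completed in under 5 ms; a percentile figure, or better still a full per-second latency chart, conveys the quality of service an application would actually experience.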
We conduct our tests over the full LBA range so that results reflect each HDD's average performance across the entire platter surface. All test samples offer 4TB of capacity, but the WD Red Pro and SE models spin at a higher 7,200 RPM. The first page of results provides the "key" to understanding and interpreting our test methodology.
- Page 1 [Introduction]
- Page 2 [WD Red Pro Internals and Specifications]
- Page 3 [Test System and Methodology]
- Page 4 [Benchmarks - 4k Random Read/Write]
- Page 5 [Benchmarks - 8k Random Read/Write]
- Page 6 [Benchmarks - 128k Sequential Read/Write]
- Page 7 [Benchmarks - Database/OLTP and File Server]
- Page 8 [Benchmarks - Email Server]
- Page 9 [Final Thoughts]