
Seagate 16TB IronWolf, IronWolf Pro, and Exos X16 HDD Review (Page 2)

By Chris Ramseyer on Aug 11, 2019 @ 16:12 CDT
TweakTown Rating: 95% | Manufacturer: Seagate

CIFS Performance Testing

Testing Notes


Today we get to use our QSAN XCubeNAS XN8012R for the first time outside of its feature review. This is my system of choice for testing hard disk drives, solid state drives, and advanced cache products designed for use in storage servers.

The system connects to our Supermicro SSE-X3348TR switch with both 10GbE and 40GbE connectivity. The workload comes from a modified Quanta (QCT) MESOS CB220 server.

In the NAS, we use eight drives from each series in a RAID 6 array without an SSD cache. The QSAN XN8012R uses the ZFS file system and a 10-gigabit Ethernet connection to the network.
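As a quick point of reference (our arithmetic, not a vendor figure), RAID 6 gives up two drives' worth of capacity to parity, so an eight-drive array of 16TB disks yields roughly 96TB of raw usable space before ZFS overhead:

```python
# RAID 6 keeps dual parity, so two drives' worth of space is lost.
def raid6_usable_tb(drives: int, drive_tb: float) -> float:
    if drives < 4:
        raise ValueError("RAID 6 requires at least four drives")
    return (drives - 2) * drive_tb

print(raid6_usable_tb(8, 16))  # eight 16TB drives -> 96.0 TB usable
```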

Sequential Read Performance


Hard disk drives deliver inconsistent performance under heavy workloads compared to enterprise solid state drives. The sequential read chart shows this as we pull data from the arrays with increasing intensity. Each dot on the chart represents an IO.
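The review doesn't name its workload generator, but a sweep like this can be approximated with an open tool such as fio. The job below is a hypothetical sketch, not our actual test profile: it issues 128KB sequential reads at a fixed queue depth against a file on the mounted share, and you would step iodepth to reproduce the increasing intensity shown in the charts (path and sizes are placeholders).

```ini
; Hypothetical fio job: 128KB sequential reads at a fixed
; queue depth against a file on the mounted CIFS share.
; Step iodepth (1, 2, 4, 8, ...) to sweep outstanding IO.
[global]
ioengine=libaio
direct=1
time_based=1
runtime=300

[seq-read]
rw=read
bs=128k
iodepth=8
filename=/mnt/nas/testfile
size=64g
```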

The two IronWolf Series drives outperform the Exos X slightly in the sequential read test, but there are plenty of outliers with all three series.

Sequential Write Performance


Even though we don't have flash sitting in front of the arrays today, we still show the preconditioning and steady-state charts that will allow you to compare these three products to other products and array types later.

In the sequential write test, we see very similar performance between the three arrays. The IronWolf Pro and Exos X show a distinct improvement over the base IronWolf at 4 OIO, though.

Sequential Mixed Workloads


The mixed workload and 70% read charts show us more of the performance inconsistency at very high queue depths. These are worst-case numbers for HDDs since the heavy workload compounds latency between each IO.
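The relationship behind that compounding effect is Little's Law: sustained IOPS equals outstanding IO divided by mean latency. Piling on queue depth only helps if latency holds steady, which spinning disks can't do under heavy mixed traffic. A quick illustration with hypothetical numbers (not taken from our charts):

```python
# Little's Law: IOPS = outstanding IO / mean latency (in seconds).
def iops(outstanding_io: int, mean_latency_ms: float) -> float:
    return outstanding_io / (mean_latency_ms / 1000.0)

# Doubling the queue gains nothing if latency doubles with it.
print(iops(16, 8.0))   # 16 OIO at 8 ms latency  -> 2000 IOPS
print(iops(32, 16.0))  # 32 OIO at 16 ms latency -> 2000 IOPS
```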

Random Read Performance


We start to see significant performance variation in the random workloads. The first thing you will notice is the Exos walking away from the two IronWolf models, doubling and even tripling random read performance in some workloads. Less obvious is the IronWolf outperforming the IronWolf Pro. For many, this is unexpected, but we've noticed the IronWolf performing better in some workloads over the years with other capacities.

The IronWolf Pro uses optimizations to ensure steady and reliable performance in environments with more vibration. Slightly reducing performance in some key areas allows that to happen. If we tested the IronWolf and IronWolf Pro in a rack full of storage systems or even with 24 drives in a server, the IronWolf Pro would perform better than the base model.

Random Write Performance


We see similar findings in the random write tests. The Exos quickly surpasses the IronWolf models and becomes the real value leader for random workloads. The base IronWolf slightly overtakes the IronWolf Pro in our eight-drive array, but if we used a larger array, the two would reverse with the Pro extending the lead with each drive we add to the system.

Random Mixed Workloads


Random 4KB mixed workloads, and the 70% read test, give us a good indication of virtualized desktops running off network storage. This, as well as database and miscellaneous cloud storage, is where the Exos X stands tall. The two IronWolf products still perform well for their respective markets. Most IronWolf drives simply fall into mass storage roles, holding cold data for end users, be they consumers, creators, or businesses.

