Client vs. Enterprise Specifications
The disparity in performance specifications stems from the different test protocols used to measure consumer and enterprise SSDs. That difference in methodology, in turn, reflects the characteristics of each drive's intended environment.
The JEDEC specification for client SSDs calls for only eight hours of active use per day. Even more importantly, client SSDs are tuned for light, 'bursty' workloads. In a consumer environment the SSD sits at or near idle for the majority of its use, which allows the drive's internal functions to keep performance 'fresh' through a number of background processes, such as garbage collection. The internal mechanisms of the SSD are designed and optimized for this type of environment, and function accordingly.
Client SSDs also utilize extra spare area for internal functions, and are rarely used at full capacity, which further boosts performance. The abundance of free space is complemented by active use of the TRIM command, which informs the SSD of LBAs that no longer hold valid data so the drive can reclaim them in the background and maintain high performance.
The relatively stress-free environment of a client SSD allows it to deliver much higher performance during operation, and this is reflected in how client performance is measured and marketed. Client SSD performance is typically measured in FOB (Fresh Out of Box) condition, with no preconditioning and with very little (typically 8-10GB) of the available LBA range utilized.
JEDEC specifications for enterprise SSDs are much more stringent, calling for 24 hours of active use per day. Running enterprise SSDs at full capacity is commonplace, as it maximizes the ROI of a premium tier of storage. Recording performance metrics only after a protracted preconditioning cycle, and with the entire LBA range utilized, reflects that workload environment. Preconditioning and full-capacity testing alter the results drastically, leaving us with lower, more realistic performance specifications.
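The point of preconditioning is to drive the SSD into steady state before any numbers are recorded. The SNIA Performance Test Specification defines steady state roughly as follows: over a measurement window (typically five rounds) of a tracking variable such as IOPS, the max-min excursion stays within 20% of the window average, and the excursion of a least-squares best-fit line across the window stays within 10% of the average. A minimal sketch of that convergence check, assuming those tolerances:

```python
def is_steady_state(window, range_tol=0.20, slope_tol=0.10):
    """SNIA PTS-style steady-state check over a measurement window
    (e.g. five rounds of an IOPS tracking variable)."""
    n = len(window)
    avg = sum(window) / n
    # Criterion 1: data excursion (max - min) within range_tol of average.
    if (max(window) - min(window)) > range_tol * avg:
        return False
    # Criterion 2: excursion of the least-squares fit line across the
    # window (|slope| * window width) within slope_tol of average.
    xs = range(n)
    x_mean = sum(xs) / n
    slope = (sum((x - x_mean) * (y - avg) for x, y in zip(xs, window))
             / sum((x - x_mean) ** 2 for x in xs))
    return abs(slope * (n - 1)) <= slope_tol * avg


declining = [120000, 95000, 80000, 72000, 68000]  # FOB, still settling
flat = [50200, 49800, 50100, 49900, 50000]        # converged
print(is_steady_state(declining), is_steady_state(flat))  # → False True
```

In practice a test harness keeps writing preconditioning rounds and slides this window forward until the check passes, and only then records the specification numbers.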
The absence of the TRIM command is another important consideration when comparing client and enterprise hardware. TRIM is not utilized in the majority of enterprise applications, and removing it results in markedly lower performance with any SSD.
When users compare the specifications of a top-shelf consumer SSD to those of an enterprise SSD, the two appear similar, and in some cases the client hardware even appears to be faster. In deployment, however, the performance of the client SSD will drop dramatically. Some consumer SSDs excel at pure read or pure write workloads, but introducing any mixed workload creates a dramatic drop in performance. In our testing, we will also observe the massive difference in latency distribution.
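Latency distributions are usually compared at high percentiles (p99, p99.9) rather than averages, because a drive with a fine average can still stall badly in the tail. A minimal sketch using Python's standard library, with hypothetical sample data standing in for measured service times:

```python
import random
import statistics

random.seed(7)
# Hypothetical service times (ms): mostly fast completions, plus a long
# tail of the kind garbage collection produces under a mixed workload.
latencies = [random.gauss(0.25, 0.05) for _ in range(9900)]
latencies += [random.uniform(5, 50) for _ in range(100)]  # tail stalls

# quantiles(n=1000) yields 999 cut points: index 989 is p99, 998 is p99.9.
cuts = statistics.quantiles(latencies, n=1000)
print(f"avg={statistics.mean(latencies):.2f}ms  "
      f"p99={cuts[989]:.2f}ms  p99.9={cuts[998]:.2f}ms")
```

Here the average looks harmless while the p99.9 figure is two orders of magnitude worse, which is exactly the behavior that separates enterprise drives (tuned for a tight tail) from client drives in mixed workloads.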
Adding to the problem, consumer SSDs aren't tested in a uniform fashion with industry-approved tools. With the help of many major manufacturers, SNIA has set forth separate testing methodologies for client and enterprise SSDs. While the enterprise portion of the market adheres to the basic tenets of the SNIA methodology, the consumer market largely disregards the test methodology tailored for its use.
This leads to blatantly misleading specifications that are not representative of performance under a typical consumer workload, let alone the harsh realities of enterprise workloads. The fact that results are published for pure read and pure write workloads only, with no mixed workloads, makes them more misleading still.